00:00:00.001 Started by upstream project "autotest-per-patch" build number 126159
00:00:00.001 originally caused by:
00:00:00.002 Started by upstream project "jbp-per-patch" build number 23859
00:00:00.002 originally caused by:
00:00:00.002 Started by user sys_sgci
00:00:00.012 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.013 The recommended git tool is: git
00:00:00.013 using credential 00000000-0000-0000-0000-000000000002
00:00:00.015 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.030 Fetching changes from the remote Git repository
00:00:00.033 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.054 Using shallow fetch with depth 1
00:00:00.054 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.054 > git --version # timeout=10
00:00:00.078 > git --version # 'git version 2.39.2'
00:00:00.078 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.131 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.131 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/changes/75/21875/22 # timeout=5
00:00:02.695 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.707 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.718 Checking out Revision 8c6732c9e0fe7c9c74cd1fb560a619e554726af3 (FETCH_HEAD)
00:00:02.718 > git config core.sparsecheckout # timeout=10
00:00:02.732 > git read-tree -mu HEAD # timeout=10
00:00:02.754 > git checkout -f 8c6732c9e0fe7c9c74cd1fb560a619e554726af3 # timeout=5
00:00:02.776 Commit message: "jenkins/jjb-config: Remove SPDK_TEST_RELEASE_BUILD from packaging job"
00:00:02.776 > git rev-list --no-walk b0ebb039b16703d64cc7534b6e0fa0780ed1e683 # timeout=10
00:00:02.886 [Pipeline] Start of Pipeline
00:00:02.901 [Pipeline] library
00:00:02.903 Loading library shm_lib@master
00:00:02.903 Library shm_lib@master is cached. Copying from home.
00:00:02.920 [Pipeline] node
00:00:02.932 Running on WFP50 in /var/jenkins/workspace/crypto-phy-autotest
00:00:02.934 [Pipeline] {
00:00:02.948 [Pipeline] catchError
00:00:02.951 [Pipeline] {
00:00:02.968 [Pipeline] wrap
00:00:02.978 [Pipeline] {
00:00:02.986 [Pipeline] stage
00:00:02.987 [Pipeline] { (Prologue)
00:00:03.171 [Pipeline] sh
00:00:03.456 + logger -p user.info -t JENKINS-CI
00:00:03.474 [Pipeline] echo
00:00:03.476 Node: WFP50
00:00:03.484 [Pipeline] sh
00:00:03.781 [Pipeline] setCustomBuildProperty
00:00:03.793 [Pipeline] echo
00:00:03.794 Cleanup processes
00:00:03.800 [Pipeline] sh
00:00:04.185 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.185 316491 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.195 [Pipeline] sh
00:00:04.473 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.473 ++ grep -v 'sudo pgrep'
00:00:04.473 ++ awk '{print $1}'
00:00:04.473 + sudo kill -9
00:00:04.473 + true
00:00:04.487 [Pipeline] cleanWs
00:00:04.497 [WS-CLEANUP] Deleting project workspace...
00:00:04.497 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.503 [WS-CLEANUP] done
00:00:04.507 [Pipeline] setCustomBuildProperty
00:00:04.518 [Pipeline] sh
00:00:04.794 + sudo git config --global --replace-all safe.directory '*'
00:00:04.849 [Pipeline] httpRequest
00:00:04.871 [Pipeline] echo
00:00:04.873 Sorcerer 10.211.164.101 is alive
00:00:04.880 [Pipeline] httpRequest
00:00:04.883 HttpMethod: GET
00:00:04.884 URL: http://10.211.164.101/packages/jbp_8c6732c9e0fe7c9c74cd1fb560a619e554726af3.tar.gz
00:00:04.884 Sending request to url: http://10.211.164.101/packages/jbp_8c6732c9e0fe7c9c74cd1fb560a619e554726af3.tar.gz
00:00:04.909 Response Code: HTTP/1.1 200 OK
00:00:04.909 Success: Status code 200 is in the accepted range: 200,404
00:00:04.910 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_8c6732c9e0fe7c9c74cd1fb560a619e554726af3.tar.gz
00:00:28.950 [Pipeline] sh
00:00:29.233 + tar --no-same-owner -xf jbp_8c6732c9e0fe7c9c74cd1fb560a619e554726af3.tar.gz
00:00:29.251 [Pipeline] httpRequest
00:00:29.268 [Pipeline] echo
00:00:29.269 Sorcerer 10.211.164.101 is alive
00:00:29.278 [Pipeline] httpRequest
00:00:29.282 HttpMethod: GET
00:00:29.283 URL: http://10.211.164.101/packages/spdk_719d03c6adf20011bb50ac4109e0be7741c0d1c5.tar.gz
00:00:29.283 Sending request to url: http://10.211.164.101/packages/spdk_719d03c6adf20011bb50ac4109e0be7741c0d1c5.tar.gz
00:00:29.292 Response Code: HTTP/1.1 200 OK
00:00:29.292 Success: Status code 200 is in the accepted range: 200,404
00:00:29.293 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_719d03c6adf20011bb50ac4109e0be7741c0d1c5.tar.gz
00:01:50.881 [Pipeline] sh
00:01:51.164 + tar --no-same-owner -xf spdk_719d03c6adf20011bb50ac4109e0be7741c0d1c5.tar.gz
00:01:55.360 [Pipeline] sh
00:01:55.646 + git -C spdk log --oneline -n5
00:01:55.646 719d03c6a sock/uring: only register net impl if supported
00:01:55.646 e64f085ad vbdev_lvol_ut: unify usage of dummy base bdev
00:01:55.646 9937c0160 lib/rdma: bind TRACE_BDEV_IO_START/DONE to OBJECT_NVMF_RDMA_IO
00:01:55.646 6c7c1f57e accel: add sequence outstanding stat
00:01:55.646 3bc8e6a26 accel: add utility to put task
00:01:55.660 [Pipeline] }
00:01:55.679 [Pipeline] // stage
00:01:55.688 [Pipeline] stage
00:01:55.691 [Pipeline] { (Prepare)
00:01:55.711 [Pipeline] writeFile
00:01:55.728 [Pipeline] sh
00:01:56.009 + logger -p user.info -t JENKINS-CI
00:01:56.022 [Pipeline] sh
00:01:56.304 + logger -p user.info -t JENKINS-CI
00:01:56.316 [Pipeline] sh
00:01:56.596 + cat autorun-spdk.conf
00:01:56.596 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:56.596 SPDK_TEST_BLOCKDEV=1
00:01:56.596 SPDK_TEST_ISAL=1
00:01:56.596 SPDK_TEST_CRYPTO=1
00:01:56.596 SPDK_TEST_REDUCE=1
00:01:56.596 SPDK_TEST_VBDEV_COMPRESS=1
00:01:56.596 SPDK_RUN_UBSAN=1
00:01:56.604 RUN_NIGHTLY=0
00:01:56.610 [Pipeline] readFile
00:01:56.639 [Pipeline] withEnv
00:01:56.640 [Pipeline] {
00:01:56.650 [Pipeline] sh
00:01:56.929 + set -ex
00:01:56.929 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:01:56.929 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:56.929 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:56.929 ++ SPDK_TEST_BLOCKDEV=1
00:01:56.929 ++ SPDK_TEST_ISAL=1
00:01:56.929 ++ SPDK_TEST_CRYPTO=1
00:01:56.929 ++ SPDK_TEST_REDUCE=1
00:01:56.929 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:56.929 ++ SPDK_RUN_UBSAN=1
00:01:56.929 ++ RUN_NIGHTLY=0
00:01:56.929 + case $SPDK_TEST_NVMF_NICS in
00:01:56.929 + DRIVERS=
00:01:56.929 + [[ -n '' ]]
00:01:56.929 + exit 0
00:01:56.938 [Pipeline] }
00:01:56.955 [Pipeline] // withEnv
00:01:56.960 [Pipeline] }
00:01:56.976 [Pipeline] // stage
00:01:56.987 [Pipeline] catchError
00:01:56.988 [Pipeline] {
00:01:57.004 [Pipeline] timeout
00:01:57.004 Timeout set to expire in 40 min
00:01:57.006 [Pipeline] {
00:01:57.022 [Pipeline] stage
00:01:57.025 [Pipeline] { (Tests)
00:01:57.040 [Pipeline] sh
00:01:57.320 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:01:57.320 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:01:57.320 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:01:57.320 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:01:57.320 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:57.320 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:01:57.320 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:01:57.321 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:57.321 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:01:57.321 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:57.321 + [[ crypto-phy-autotest == pkgdep-* ]]
00:01:57.321 + cd /var/jenkins/workspace/crypto-phy-autotest
00:01:57.321 + source /etc/os-release
00:01:57.321 ++ NAME='Fedora Linux'
00:01:57.321 ++ VERSION='38 (Cloud Edition)'
00:01:57.321 ++ ID=fedora
00:01:57.321 ++ VERSION_ID=38
00:01:57.321 ++ VERSION_CODENAME=
00:01:57.321 ++ PLATFORM_ID=platform:f38
00:01:57.321 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:57.321 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:57.321 ++ LOGO=fedora-logo-icon
00:01:57.321 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:57.321 ++ HOME_URL=https://fedoraproject.org/
00:01:57.321 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:57.321 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:57.321 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:57.321 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:57.321 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:57.321 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:57.321 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:57.321 ++ SUPPORT_END=2024-05-14
00:01:57.321 ++ VARIANT='Cloud Edition'
00:01:57.321 ++ VARIANT_ID=cloud
00:01:57.321 + uname -a
00:01:57.321 Linux spdk-wfp-50 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:57.321 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:02:00.611 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:02:00.611 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:02:00.611 Hugepages
00:02:00.611 node hugesize free / total
00:02:00.611 node0 1048576kB 0 / 0
00:02:00.611 node0 2048kB 0 / 0
00:02:00.611 node1 1048576kB 0 / 0
00:02:00.611 node1 2048kB 0 / 0
00:02:00.611
00:02:00.611 Type BDF Vendor Device NUMA Driver Device Block devices
00:02:00.611 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:02:00.611 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:02:00.611 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:02:00.611 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:02:00.611 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:02:00.611 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:02:00.611 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:02:00.611 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:02:00.611 NVMe 0000:5e:00.0 8086 0b60 0 nvme nvme0 nvme0n1
00:02:00.611 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:02:00.611 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:02:00.611 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:02:00.611 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:02:00.611 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:02:00.611 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:02:00.611 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:02:00.870 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:02:00.870 VMD 0000:85:05.5 8086 201d 1 vfio-pci - -
00:02:00.870 VMD 0000:d7:05.5 8086 201d 1 vfio-pci - -
00:02:00.870 + rm -f /tmp/spdk-ld-path
00:02:00.870 + source autorun-spdk.conf
00:02:00.870 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:00.870 ++ SPDK_TEST_BLOCKDEV=1
00:02:00.870 ++ SPDK_TEST_ISAL=1
00:02:00.870 ++ SPDK_TEST_CRYPTO=1
00:02:00.870 ++ SPDK_TEST_REDUCE=1
00:02:00.870 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:02:00.870 ++ SPDK_RUN_UBSAN=1
00:02:00.870 ++ RUN_NIGHTLY=0
00:02:00.870 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:02:00.870 + [[ -n '' ]]
00:02:00.870 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:02:00.870 + for M in /var/spdk/build-*-manifest.txt
00:02:00.870 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:02:00.870 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:02:00.870 + for M in /var/spdk/build-*-manifest.txt
00:02:00.870 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:02:00.870 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:02:00.870 ++ uname
00:02:00.870 + [[ Linux == \L\i\n\u\x ]]
00:02:00.870 + sudo dmesg -T
00:02:00.870 + sudo dmesg --clear
00:02:00.870 + dmesg_pid=317971
00:02:00.870 + [[ Fedora Linux == FreeBSD ]]
00:02:00.870 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:00.870 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:02:00.870 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:02:00.870 + [[ -x /usr/src/fio-static/fio ]]
00:02:00.870 + export FIO_BIN=/usr/src/fio-static/fio
00:02:00.870 + FIO_BIN=/usr/src/fio-static/fio
00:02:00.870 + sudo dmesg -Tw
00:02:00.870 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:02:00.870 + [[ ! -v VFIO_QEMU_BIN ]]
00:02:00.870 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:02:00.870 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:00.870 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:02:00.870 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:02:00.870 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:00.870 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:02:00.870 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:02:00.870 Test configuration:
00:02:00.870 SPDK_RUN_FUNCTIONAL_TEST=1
00:02:00.870 SPDK_TEST_BLOCKDEV=1
00:02:00.870 SPDK_TEST_ISAL=1
00:02:00.870 SPDK_TEST_CRYPTO=1
00:02:00.870 SPDK_TEST_REDUCE=1
00:02:00.870 SPDK_TEST_VBDEV_COMPRESS=1
00:02:00.870 SPDK_RUN_UBSAN=1
00:02:00.870 RUN_NIGHTLY=0
10:09:38 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:02:01.129 10:09:38 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:02:01.129 10:09:38 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:02:01.129 10:09:38 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:02:01.129 10:09:38 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:01.129 10:09:38 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:01.129 10:09:38 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:01.129 10:09:38 -- paths/export.sh@5 -- $ export PATH
00:02:01.129 10:09:38 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:02:01.129 10:09:38 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:02:01.129 10:09:38 -- common/autobuild_common.sh@444 -- $ date +%s
00:02:01.129 10:09:38 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721030978.XXXXXX
00:02:01.129 10:09:38 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721030978.E20apV
00:02:01.129 10:09:38 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:02:01.129 10:09:38 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:02:01.129 10:09:38 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:02:01.129 10:09:38 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:02:01.129 10:09:38 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:02:01.129 10:09:38 -- common/autobuild_common.sh@460 -- $ get_config_params
00:02:01.129 10:09:38 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:02:01.129 10:09:38 -- common/autotest_common.sh@10 -- $ set +x
00:02:01.129 10:09:38 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:02:01.129 10:09:38 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:02:01.129 10:09:38 -- pm/common@17 -- $ local monitor
00:02:01.129 10:09:38 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:01.129 10:09:38 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:01.129 10:09:38 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:01.129 10:09:38 -- pm/common@21 -- $ date +%s
00:02:01.129 10:09:38 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:02:01.129 10:09:38 -- pm/common@21 -- $ date +%s
00:02:01.129 10:09:38 -- pm/common@25 -- $ sleep 1
00:02:01.129 10:09:38 -- pm/common@21 -- $ date +%s
00:02:01.129 10:09:38 -- pm/common@21 -- $ date +%s
00:02:01.129 10:09:38 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721030978
00:02:01.129 10:09:38 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721030978
00:02:01.129 10:09:38 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721030978
00:02:01.129 10:09:38 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721030978
00:02:01.129 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721030978_collect-vmstat.pm.log
00:02:01.129 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721030978_collect-cpu-load.pm.log
00:02:01.129 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721030978_collect-cpu-temp.pm.log
00:02:01.129 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721030978_collect-bmc-pm.bmc.pm.log
00:02:02.099 10:09:39 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:02:02.099 10:09:39 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:02:02.099 10:09:39 -- spdk/autobuild.sh@12 -- $ umask 022
00:02:02.099 10:09:39 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:02:02.099 10:09:39 -- spdk/autobuild.sh@16 -- $ date -u
00:02:02.099 Mon Jul 15 08:09:39 AM UTC 2024
00:02:02.099 10:09:39 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:02:02.099 v24.09-pre-202-g719d03c6a
00:02:02.099 10:09:39 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:02:02.099 10:09:39 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:02:02.099 10:09:39 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:02:02.099 10:09:39 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:02:02.099 10:09:39 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:02.099 10:09:39 -- common/autotest_common.sh@10 -- $ set +x
00:02:02.099 ************************************
00:02:02.099 START TEST ubsan
00:02:02.099 ************************************
00:02:02.099 10:09:39 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan'
00:02:02.099 using ubsan
00:02:02.099
00:02:02.099 real 0m0.001s
00:02:02.099 user 0m0.000s
00:02:02.099 sys 0m0.000s
00:02:02.099 10:09:39 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable
00:02:02.099 10:09:39 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:02:02.099 ************************************
00:02:02.099 END TEST ubsan
00:02:02.099 ************************************
00:02:02.099 10:09:39 -- common/autotest_common.sh@1142 -- $ return 0
00:02:02.099 10:09:39 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:02:02.099 10:09:39 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:02.099 10:09:39 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:02.099 10:09:39 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:02:02.099 10:09:39 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:02.099 10:09:39 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:02.099 10:09:39 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:02.099 10:09:39 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:02:02.099 10:09:39 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:02:02.357 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:02:02.357 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:02:02.616 Using 'verbs' RDMA provider
00:02:18.858 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:02:33.728 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:02:33.728 Creating mk/config.mk...done.
00:02:33.728 Creating mk/cc.flags.mk...done.
00:02:33.728 Type 'make' to build.
00:02:33.728 10:10:08 -- spdk/autobuild.sh@69 -- $ run_test make make -j72
00:02:33.728 10:10:08 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']'
00:02:33.728 10:10:08 -- common/autotest_common.sh@1105 -- $ xtrace_disable
00:02:33.728 10:10:08 -- common/autotest_common.sh@10 -- $ set +x
00:02:33.728 ************************************
00:02:33.728 START TEST make
00:02:33.728 ************************************
00:02:33.728 10:10:08 make -- common/autotest_common.sh@1123 -- $ make -j72
00:02:33.728 make[1]: Nothing to be done for 'all'.
00:03:12.459 The Meson build system
00:03:12.459 Version: 1.3.1
00:03:12.459 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:03:12.459 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:03:12.459 Build type: native build
00:03:12.459 Program cat found: YES (/usr/bin/cat)
00:03:12.459 Project name: DPDK
00:03:12.459 Project version: 24.03.0
00:03:12.459 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:03:12.459 C linker for the host machine: cc ld.bfd 2.39-16
00:03:12.459 Host machine cpu family: x86_64
00:03:12.460 Host machine cpu: x86_64
00:03:12.460 Message: ## Building in Developer Mode ##
00:03:12.460 Program pkg-config found: YES (/usr/bin/pkg-config)
00:03:12.460 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:03:12.460 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:03:12.460 Program python3 found: YES (/usr/bin/python3)
00:03:12.460 Program cat found: YES (/usr/bin/cat)
00:03:12.460 Compiler for C supports arguments -march=native: YES
00:03:12.460 Checking for size of "void *" : 8
00:03:12.460 Checking for size of "void *" : 8 (cached)
00:03:12.460 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:03:12.460 Library m found: YES
00:03:12.460 Library numa found: YES
00:03:12.460 Has header "numaif.h" : YES
00:03:12.460 Library fdt found: NO
00:03:12.460 Library execinfo found: NO
00:03:12.460 Has header "execinfo.h" : YES
00:03:12.460 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:03:12.460 Run-time dependency libarchive found: NO (tried pkgconfig)
00:03:12.460 Run-time dependency libbsd found: NO (tried pkgconfig)
00:03:12.460 Run-time dependency jansson found: NO (tried pkgconfig)
00:03:12.460 Run-time dependency openssl found: YES 3.0.9
00:03:12.460 Run-time dependency libpcap found: YES 1.10.4
00:03:12.460 Has header "pcap.h" with dependency libpcap: YES
00:03:12.460 Compiler for C supports arguments -Wcast-qual: YES
00:03:12.460 Compiler for C supports arguments -Wdeprecated: YES
00:03:12.460 Compiler for C supports arguments -Wformat: YES
00:03:12.460 Compiler for C supports arguments -Wformat-nonliteral: NO
00:03:12.460 Compiler for C supports arguments -Wformat-security: NO
00:03:12.460 Compiler for C supports arguments -Wmissing-declarations: YES
00:03:12.460 Compiler for C supports arguments -Wmissing-prototypes: YES
00:03:12.460 Compiler for C supports arguments -Wnested-externs: YES
00:03:12.460 Compiler for C supports arguments -Wold-style-definition: YES
00:03:12.460 Compiler for C supports arguments -Wpointer-arith: YES
00:03:12.460 Compiler for C supports arguments -Wsign-compare: YES
00:03:12.460 Compiler for C supports arguments -Wstrict-prototypes: YES
00:03:12.460 Compiler for C supports arguments -Wundef: YES
00:03:12.460 Compiler for C supports arguments -Wwrite-strings: YES
00:03:12.460 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:03:12.460 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:03:12.460 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:03:12.460 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:03:12.460 Program objdump found: YES (/usr/bin/objdump)
00:03:12.460 Compiler for C supports arguments -mavx512f: YES
00:03:12.460 Checking if "AVX512 checking" compiles: YES
00:03:12.460 Fetching value of define "__SSE4_2__" : 1
00:03:12.460 Fetching value of define "__AES__" : 1
00:03:12.460 Fetching value of define "__AVX__" : 1
00:03:12.460 Fetching value of define "__AVX2__" : 1
00:03:12.460 Fetching value of define "__AVX512BW__" : 1
00:03:12.460 Fetching value of define "__AVX512CD__" : 1
00:03:12.460 Fetching value of define "__AVX512DQ__" : 1
00:03:12.460 Fetching value of define "__AVX512F__" : 1
00:03:12.460 Fetching value of define "__AVX512VL__" : 1
00:03:12.460 Fetching value of define "__PCLMUL__" : 1
00:03:12.460 Fetching value of define "__RDRND__" : 1
00:03:12.460 Fetching value of define "__RDSEED__" : 1
00:03:12.460 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:03:12.460 Fetching value of define "__znver1__" : (undefined)
00:03:12.460 Fetching value of define "__znver2__" : (undefined)
00:03:12.460 Fetching value of define "__znver3__" : (undefined)
00:03:12.460 Fetching value of define "__znver4__" : (undefined)
00:03:12.460 Compiler for C supports arguments -Wno-format-truncation: YES
00:03:12.460 Message: lib/log: Defining dependency "log"
00:03:12.460 Message: lib/kvargs: Defining dependency "kvargs"
00:03:12.460 Message: lib/telemetry: Defining dependency "telemetry"
00:03:12.460 Checking for function "getentropy" : NO
00:03:12.460 Message: lib/eal: Defining dependency "eal"
00:03:12.460 Message: lib/ring: Defining dependency "ring"
00:03:12.460 Message: lib/rcu: Defining dependency "rcu"
00:03:12.460 Message: lib/mempool: Defining dependency "mempool"
00:03:12.460 Message: lib/mbuf: Defining dependency "mbuf"
00:03:12.460 Fetching value of define "__PCLMUL__" : 1 (cached)
00:03:12.460 Fetching value of define "__AVX512F__" : 1 (cached)
00:03:12.460 Fetching value of define "__AVX512BW__" : 1 (cached)
00:03:12.460 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:03:12.460 Fetching value of define "__AVX512VL__" : 1 (cached)
00:03:12.460 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:03:12.460 Compiler for C supports arguments -mpclmul: YES
00:03:12.460 Compiler for C supports arguments -maes: YES
00:03:12.460 Compiler for C supports arguments -mavx512f: YES (cached)
00:03:12.460 Compiler for C supports arguments -mavx512bw: YES
00:03:12.460 Compiler for C supports arguments -mavx512dq: YES
00:03:12.460 Compiler for C supports arguments -mavx512vl: YES
00:03:12.460 Compiler for C supports arguments -mvpclmulqdq: YES
00:03:12.460 Compiler for C supports arguments -mavx2: YES
00:03:12.460 Compiler for C supports arguments -mavx: YES
00:03:12.460 Message: lib/net: Defining dependency "net"
00:03:12.460 Message: lib/meter: Defining dependency "meter"
00:03:12.460 Message: lib/ethdev: Defining dependency "ethdev"
00:03:12.460 Message: lib/pci: Defining dependency "pci"
00:03:12.460 Message: lib/cmdline: Defining dependency "cmdline"
00:03:12.460 Message: lib/hash: Defining dependency "hash"
00:03:12.460 Message: lib/timer: Defining dependency "timer"
00:03:12.460 Message: lib/compressdev: Defining dependency "compressdev"
00:03:12.460 Message: lib/cryptodev: Defining dependency "cryptodev"
00:03:12.460 Message: lib/dmadev: Defining dependency "dmadev"
00:03:12.460 Compiler for C supports arguments -Wno-cast-qual: YES
00:03:12.460 Message: lib/power: Defining dependency "power"
00:03:12.460 Message: lib/reorder: Defining dependency "reorder"
00:03:12.460 Message: lib/security: Defining dependency "security"
00:03:12.460 Has header "linux/userfaultfd.h" : YES
00:03:12.460 Has header "linux/vduse.h" : YES
00:03:12.460 Message: lib/vhost: Defining dependency "vhost"
00:03:12.460 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:03:12.460 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:03:12.460 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:03:12.460 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:03:12.460 Compiler for C supports arguments -std=c11: YES
00:03:12.460 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:03:12.460 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:03:12.460 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:03:12.460 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:03:12.460 Run-time dependency libmlx5 found: YES 1.24.44.0
00:03:12.460 Run-time dependency libibverbs found: YES 1.14.44.0
00:03:12.460 Library mtcr_ul found: NO
00:03:12.460 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:03:12.460 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:03:14.993 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:03:14.993 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:03:14.993 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:03:14.993 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, 
libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" 
with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:03:14.994 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:03:14.994 Configuring mlx5_autoconf.h using configuration 00:03:14.994 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:03:14.994 Run-time dependency libcrypto found: YES 3.0.9 00:03:14.994 Library IPSec_MB found: YES 00:03:14.994 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:03:14.994 Message: drivers/common/qat: Defining dependency "common_qat" 00:03:14.994 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:03:14.994 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:03:14.994 Library IPSec_MB found: YES 00:03:14.994 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:03:14.994 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:03:14.994 Compiler for C supports 
arguments -std=c11: YES (cached) 00:03:14.994 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:03:14.994 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:14.994 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:14.994 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:14.994 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:03:14.994 Run-time dependency libisal found: NO (tried pkgconfig) 00:03:14.994 Library libisal found: NO 00:03:14.994 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:03:14.994 Compiler for C supports arguments -std=c11: YES (cached) 00:03:14.994 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:03:14.994 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:14.994 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:14.994 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:14.994 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:03:14.994 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:03:14.994 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:03:14.994 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:03:14.994 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:03:14.994 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:03:14.994 Program doxygen found: YES (/usr/bin/doxygen) 00:03:14.994 Configuring doxy-api-html.conf using configuration 00:03:14.994 Configuring doxy-api-man.conf using configuration 00:03:14.994 Program mandb found: YES (/usr/bin/mandb) 00:03:14.994 Program sphinx-build found: NO 00:03:14.994 Configuring rte_build_config.h using configuration 00:03:14.994 Message: 00:03:14.994 ================= 00:03:14.994 Applications Enabled 00:03:14.994 ================= 00:03:14.994 
00:03:14.994 apps: 00:03:14.994 00:03:14.994 00:03:14.994 Message: 00:03:14.994 ================= 00:03:14.994 Libraries Enabled 00:03:14.994 ================= 00:03:14.994 00:03:14.994 libs: 00:03:14.994 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:03:14.994 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:03:14.994 cryptodev, dmadev, power, reorder, security, vhost, 00:03:14.994 00:03:14.994 Message: 00:03:14.994 =============== 00:03:14.994 Drivers Enabled 00:03:14.994 =============== 00:03:14.994 00:03:14.994 common: 00:03:14.994 mlx5, qat, 00:03:14.994 bus: 00:03:14.994 auxiliary, pci, vdev, 00:03:14.994 mempool: 00:03:14.994 ring, 00:03:14.994 dma: 00:03:14.994 00:03:14.994 net: 00:03:14.994 00:03:14.994 crypto: 00:03:14.994 ipsec_mb, mlx5, 00:03:14.994 compress: 00:03:14.994 isal, mlx5, 00:03:14.994 vdpa: 00:03:14.994 00:03:14.994 00:03:14.994 Message: 00:03:14.994 ================= 00:03:14.994 Content Skipped 00:03:14.994 ================= 00:03:14.994 00:03:14.994 apps: 00:03:14.994 dumpcap: explicitly disabled via build config 00:03:14.994 graph: explicitly disabled via build config 00:03:14.994 pdump: explicitly disabled via build config 00:03:14.994 proc-info: explicitly disabled via build config 00:03:14.994 test-acl: explicitly disabled via build config 00:03:14.994 test-bbdev: explicitly disabled via build config 00:03:14.994 test-cmdline: explicitly disabled via build config 00:03:14.994 test-compress-perf: explicitly disabled via build config 00:03:14.994 test-crypto-perf: explicitly disabled via build config 00:03:14.994 test-dma-perf: explicitly disabled via build config 00:03:14.994 test-eventdev: explicitly disabled via build config 00:03:14.994 test-fib: explicitly disabled via build config 00:03:14.994 test-flow-perf: explicitly disabled via build config 00:03:14.994 test-gpudev: explicitly disabled via build config 00:03:14.994 test-mldev: explicitly disabled via build config 00:03:14.994 test-pipeline: explicitly 
disabled via build config 00:03:14.994 test-pmd: explicitly disabled via build config 00:03:14.994 test-regex: explicitly disabled via build config 00:03:14.994 test-sad: explicitly disabled via build config 00:03:14.994 test-security-perf: explicitly disabled via build config 00:03:14.994 00:03:14.994 libs: 00:03:14.994 argparse: explicitly disabled via build config 00:03:14.994 metrics: explicitly disabled via build config 00:03:14.994 acl: explicitly disabled via build config 00:03:14.994 bbdev: explicitly disabled via build config 00:03:14.994 bitratestats: explicitly disabled via build config 00:03:14.994 bpf: explicitly disabled via build config 00:03:14.994 cfgfile: explicitly disabled via build config 00:03:14.994 distributor: explicitly disabled via build config 00:03:14.994 efd: explicitly disabled via build config 00:03:14.994 eventdev: explicitly disabled via build config 00:03:14.994 dispatcher: explicitly disabled via build config 00:03:14.994 gpudev: explicitly disabled via build config 00:03:14.994 gro: explicitly disabled via build config 00:03:14.994 gso: explicitly disabled via build config 00:03:14.994 ip_frag: explicitly disabled via build config 00:03:14.994 jobstats: explicitly disabled via build config 00:03:14.994 latencystats: explicitly disabled via build config 00:03:14.994 lpm: explicitly disabled via build config 00:03:14.994 member: explicitly disabled via build config 00:03:14.994 pcapng: explicitly disabled via build config 00:03:14.994 rawdev: explicitly disabled via build config 00:03:14.994 regexdev: explicitly disabled via build config 00:03:14.994 mldev: explicitly disabled via build config 00:03:14.994 rib: explicitly disabled via build config 00:03:14.994 sched: explicitly disabled via build config 00:03:14.994 stack: explicitly disabled via build config 00:03:14.994 ipsec: explicitly disabled via build config 00:03:14.994 pdcp: explicitly disabled via build config 00:03:14.994 fib: explicitly disabled via build config 
00:03:14.994 port: explicitly disabled via build config 00:03:14.994 pdump: explicitly disabled via build config 00:03:14.994 table: explicitly disabled via build config 00:03:14.994 pipeline: explicitly disabled via build config 00:03:14.994 graph: explicitly disabled via build config 00:03:14.994 node: explicitly disabled via build config 00:03:14.994 00:03:14.994 drivers: 00:03:14.994 common/cpt: not in enabled drivers build config 00:03:14.994 common/dpaax: not in enabled drivers build config 00:03:14.994 common/iavf: not in enabled drivers build config 00:03:14.994 common/idpf: not in enabled drivers build config 00:03:14.994 common/ionic: not in enabled drivers build config 00:03:14.994 common/mvep: not in enabled drivers build config 00:03:14.994 common/octeontx: not in enabled drivers build config 00:03:14.994 bus/cdx: not in enabled drivers build config 00:03:14.994 bus/dpaa: not in enabled drivers build config 00:03:14.994 bus/fslmc: not in enabled drivers build config 00:03:14.994 bus/ifpga: not in enabled drivers build config 00:03:14.994 bus/platform: not in enabled drivers build config 00:03:14.994 bus/uacce: not in enabled drivers build config 00:03:14.994 bus/vmbus: not in enabled drivers build config 00:03:14.994 common/cnxk: not in enabled drivers build config 00:03:14.994 common/nfp: not in enabled drivers build config 00:03:14.994 common/nitrox: not in enabled drivers build config 00:03:14.995 common/sfc_efx: not in enabled drivers build config 00:03:14.995 mempool/bucket: not in enabled drivers build config 00:03:14.995 mempool/cnxk: not in enabled drivers build config 00:03:14.995 mempool/dpaa: not in enabled drivers build config 00:03:14.995 mempool/dpaa2: not in enabled drivers build config 00:03:14.995 mempool/octeontx: not in enabled drivers build config 00:03:14.995 mempool/stack: not in enabled drivers build config 00:03:14.995 dma/cnxk: not in enabled drivers build config 00:03:14.995 dma/dpaa: not in enabled drivers build config 
00:03:14.995 dma/dpaa2: not in enabled drivers build config 00:03:14.995 dma/hisilicon: not in enabled drivers build config 00:03:14.995 dma/idxd: not in enabled drivers build config 00:03:14.995 dma/ioat: not in enabled drivers build config 00:03:14.995 dma/skeleton: not in enabled drivers build config 00:03:14.995 net/af_packet: not in enabled drivers build config 00:03:14.995 net/af_xdp: not in enabled drivers build config 00:03:14.995 net/ark: not in enabled drivers build config 00:03:14.995 net/atlantic: not in enabled drivers build config 00:03:14.995 net/avp: not in enabled drivers build config 00:03:14.995 net/axgbe: not in enabled drivers build config 00:03:14.995 net/bnx2x: not in enabled drivers build config 00:03:14.995 net/bnxt: not in enabled drivers build config 00:03:14.995 net/bonding: not in enabled drivers build config 00:03:14.995 net/cnxk: not in enabled drivers build config 00:03:14.995 net/cpfl: not in enabled drivers build config 00:03:14.995 net/cxgbe: not in enabled drivers build config 00:03:14.995 net/dpaa: not in enabled drivers build config 00:03:14.995 net/dpaa2: not in enabled drivers build config 00:03:14.995 net/e1000: not in enabled drivers build config 00:03:14.995 net/ena: not in enabled drivers build config 00:03:14.995 net/enetc: not in enabled drivers build config 00:03:14.995 net/enetfec: not in enabled drivers build config 00:03:14.995 net/enic: not in enabled drivers build config 00:03:14.995 net/failsafe: not in enabled drivers build config 00:03:14.995 net/fm10k: not in enabled drivers build config 00:03:14.995 net/gve: not in enabled drivers build config 00:03:14.995 net/hinic: not in enabled drivers build config 00:03:14.995 net/hns3: not in enabled drivers build config 00:03:14.995 net/i40e: not in enabled drivers build config 00:03:14.995 net/iavf: not in enabled drivers build config 00:03:14.995 net/ice: not in enabled drivers build config 00:03:14.995 net/idpf: not in enabled drivers build config 00:03:14.995 
net/igc: not in enabled drivers build config 00:03:14.995 net/ionic: not in enabled drivers build config 00:03:14.995 net/ipn3ke: not in enabled drivers build config 00:03:14.995 net/ixgbe: not in enabled drivers build config 00:03:14.995 net/mana: not in enabled drivers build config 00:03:14.995 net/memif: not in enabled drivers build config 00:03:14.995 net/mlx4: not in enabled drivers build config 00:03:14.995 net/mlx5: not in enabled drivers build config 00:03:14.995 net/mvneta: not in enabled drivers build config 00:03:14.995 net/mvpp2: not in enabled drivers build config 00:03:14.995 net/netvsc: not in enabled drivers build config 00:03:14.995 net/nfb: not in enabled drivers build config 00:03:14.995 net/nfp: not in enabled drivers build config 00:03:14.995 net/ngbe: not in enabled drivers build config 00:03:14.995 net/null: not in enabled drivers build config 00:03:14.995 net/octeontx: not in enabled drivers build config 00:03:14.995 net/octeon_ep: not in enabled drivers build config 00:03:14.995 net/pcap: not in enabled drivers build config 00:03:14.995 net/pfe: not in enabled drivers build config 00:03:14.995 net/qede: not in enabled drivers build config 00:03:14.995 net/ring: not in enabled drivers build config 00:03:14.995 net/sfc: not in enabled drivers build config 00:03:14.995 net/softnic: not in enabled drivers build config 00:03:14.995 net/tap: not in enabled drivers build config 00:03:14.995 net/thunderx: not in enabled drivers build config 00:03:14.995 net/txgbe: not in enabled drivers build config 00:03:14.995 net/vdev_netvsc: not in enabled drivers build config 00:03:14.995 net/vhost: not in enabled drivers build config 00:03:14.995 net/virtio: not in enabled drivers build config 00:03:14.995 net/vmxnet3: not in enabled drivers build config 00:03:14.995 raw/*: missing internal dependency, "rawdev" 00:03:14.995 crypto/armv8: not in enabled drivers build config 00:03:14.995 crypto/bcmfs: not in enabled drivers build config 00:03:14.995 
crypto/caam_jr: not in enabled drivers build config 00:03:14.995 crypto/ccp: not in enabled drivers build config 00:03:14.995 crypto/cnxk: not in enabled drivers build config 00:03:14.995 crypto/dpaa_sec: not in enabled drivers build config 00:03:14.995 crypto/dpaa2_sec: not in enabled drivers build config 00:03:14.995 crypto/mvsam: not in enabled drivers build config 00:03:14.995 crypto/nitrox: not in enabled drivers build config 00:03:14.995 crypto/null: not in enabled drivers build config 00:03:14.995 crypto/octeontx: not in enabled drivers build config 00:03:14.995 crypto/openssl: not in enabled drivers build config 00:03:14.995 crypto/scheduler: not in enabled drivers build config 00:03:14.995 crypto/uadk: not in enabled drivers build config 00:03:14.995 crypto/virtio: not in enabled drivers build config 00:03:14.995 compress/nitrox: not in enabled drivers build config 00:03:14.995 compress/octeontx: not in enabled drivers build config 00:03:14.995 compress/zlib: not in enabled drivers build config 00:03:14.995 regex/*: missing internal dependency, "regexdev" 00:03:14.995 ml/*: missing internal dependency, "mldev" 00:03:14.995 vdpa/ifc: not in enabled drivers build config 00:03:14.995 vdpa/mlx5: not in enabled drivers build config 00:03:14.995 vdpa/nfp: not in enabled drivers build config 00:03:14.995 vdpa/sfc: not in enabled drivers build config 00:03:14.995 event/*: missing internal dependency, "eventdev" 00:03:14.995 baseband/*: missing internal dependency, "bbdev" 00:03:14.995 gpu/*: missing internal dependency, "gpudev" 00:03:14.995 00:03:14.995 00:03:15.561 Build targets in project: 115 00:03:15.561 00:03:15.561 DPDK 24.03.0 00:03:15.561 00:03:15.561 User defined options 00:03:15.561 buildtype : debug 00:03:15.561 default_library : shared 00:03:15.561 libdir : lib 00:03:15.561 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:03:15.561 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:03:15.561 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:03:15.561 cpu_instruction_set: native 00:03:15.561 disable_apps : test-dma-perf,test,test-sad,test-acl,test-pmd,test-mldev,test-compress-perf,test-cmdline,test-regex,test-fib,graph,test-bbdev,dumpcap,test-gpudev,proc-info,test-pipeline,test-flow-perf,test-crypto-perf,pdump,test-eventdev,test-security-perf 00:03:15.561 disable_libs : port,lpm,ipsec,regexdev,dispatcher,argparse,bitratestats,rawdev,stack,graph,acl,bbdev,pipeline,member,sched,pcapng,mldev,eventdev,efd,metrics,latencystats,cfgfile,ip_frag,jobstats,pdump,pdcp,rib,node,fib,distributor,gso,table,bpf,gpudev,gro 00:03:15.561 enable_docs : false 00:03:15.561 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:03:15.561 enable_kmods : false 00:03:15.561 max_lcores : 128 00:03:15.561 tests : false 00:03:15.561 00:03:15.561 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:15.819 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:03:16.106 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:03:16.106 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:03:16.106 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:03:16.106 [4/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:03:16.106 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:03:16.106 [6/378] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:03:16.106 [7/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:03:16.106 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:03:16.106 [9/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:03:16.106 [10/378] Linking static target lib/librte_kvargs.a 00:03:16.106 [11/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:03:16.106 [12/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:03:16.106 [13/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:03:16.106 [14/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:03:16.106 [15/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:03:16.106 [16/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:03:16.106 [17/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:03:16.106 [18/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:03:16.106 [19/378] Linking static target lib/librte_log.a 00:03:16.389 [20/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:03:16.654 [21/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:03:16.654 [22/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:03:16.654 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:03:16.654 [24/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:03:16.654 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:03:16.654 [26/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:03:16.654 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:03:16.654 [28/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:03:16.654 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:03:16.654 [30/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:03:16.654 [31/378] Linking static target lib/librte_ring.a 00:03:16.654 [32/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:03:16.654 [33/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:03:16.654 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:03:16.654 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:03:16.654 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:03:16.654 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:03:16.654 [38/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:03:16.654 [39/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:03:16.654 [40/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:03:16.654 [41/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:03:16.654 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:03:16.654 [43/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:03:16.654 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:03:16.654 [45/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:03:16.654 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:03:16.654 [47/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:03:16.654 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:03:16.654 [49/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:03:16.654 [50/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:03:16.654 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:03:16.654 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:03:16.654 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:03:16.654 [54/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:03:16.654 [55/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:03:16.654 [56/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:03:16.654 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:03:16.654 [58/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:03:16.654 [59/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:03:16.654 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:03:16.654 [61/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:03:16.654 [62/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:03:16.654 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:03:16.654 [64/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:03:16.654 [65/378] Linking static target lib/librte_telemetry.a 00:03:16.654 [66/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:03:16.919 [67/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:03:16.919 [68/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:03:16.919 [69/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:03:16.919 [70/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:03:16.919 [71/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:03:16.919 [72/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:03:16.919 [73/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 
00:03:16.919 [74/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:03:16.919 [75/378] Linking static target lib/librte_pci.a 00:03:16.919 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:03:16.919 [77/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:03:16.919 [78/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:03:16.919 [79/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:03:16.919 [80/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:03:16.919 [81/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:03:16.919 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:03:16.919 [83/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:03:16.919 [84/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:03:16.919 [85/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:03:16.919 [86/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:03:16.919 [87/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:03:16.919 [88/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:03:16.919 [89/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:03:16.919 [90/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:03:16.919 [91/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:03:16.919 [92/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:03:16.919 [93/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:03:16.919 [94/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:03:16.919 [95/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:03:16.919 [96/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:03:16.919 
[97/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:03:16.919 [98/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:03:16.919 [99/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:03:16.919 [100/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:03:16.919 [101/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:03:16.919 [102/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:03:16.919 [103/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:03:16.919 [104/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:03:16.919 [105/378] Linking static target lib/librte_mempool.a 00:03:16.919 [106/378] Linking static target lib/librte_rcu.a 00:03:16.919 [107/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:03:16.919 [108/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:03:16.919 [109/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:03:16.919 [110/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:03:17.182 [111/378] Linking static target lib/librte_net.a 00:03:17.182 [112/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:03:17.182 [113/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:03:17.182 [114/378] Linking static target lib/librte_meter.a 00:03:17.182 [115/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:03:17.182 [116/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:03:17.182 [117/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:03:17.182 [118/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:03:17.182 [119/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:03:17.182 [120/378] Linking target 
lib/librte_log.so.24.1 00:03:17.182 [121/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:17.182 [122/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:03:17.182 [123/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:03:17.182 [124/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:03:17.445 [125/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:03:17.445 [126/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:03:17.445 [127/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:03:17.445 [128/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:03:17.445 [129/378] Linking static target lib/librte_mbuf.a 00:03:17.445 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:03:17.445 [131/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:03:17.445 [132/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:03:17.445 [133/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:03:17.445 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:03:17.445 [135/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:03:17.445 [136/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:03:17.445 [137/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:03:17.445 [138/378] Linking static target lib/librte_cmdline.a 00:03:17.445 [139/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:03:17.445 [140/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:03:17.445 [141/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:03:17.445 [142/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 
00:03:17.445 [143/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:03:17.445 [144/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:03:17.445 [145/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:03:17.445 [146/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:03:17.445 [147/378] Linking static target lib/librte_timer.a 00:03:17.445 [148/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:03:17.445 [149/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:03:17.445 [150/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:03:17.445 [151/378] Linking static target lib/librte_eal.a 00:03:17.445 [152/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:03:17.445 [153/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:03:17.445 [154/378] Linking target lib/librte_kvargs.so.24.1 00:03:17.445 [155/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:03:17.445 [156/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:03:17.445 [157/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:03:17.445 [158/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:03:17.445 [159/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:03:17.445 [160/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:03:17.445 [161/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:03:17.445 [162/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:03:17.445 [163/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:03:17.708 [164/378] Compiling C object 
lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:03:17.708 [165/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:03:17.708 [166/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:03:17.708 [167/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:03:17.708 [168/378] Linking target lib/librte_telemetry.so.24.1 00:03:17.708 [169/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:03:17.708 [170/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:03:17.708 [171/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:03:17.708 [172/378] Linking static target lib/librte_compressdev.a 00:03:17.708 [173/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:03:17.708 [174/378] Linking static target lib/librte_dmadev.a 00:03:17.708 [175/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:03:17.708 [176/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:03:17.708 [177/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:03:17.708 [178/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:03:17.708 [179/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:03:17.708 [180/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:03:17.708 [181/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:03:17.708 [182/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:03:17.708 [183/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:03:17.708 [184/378] Linking static target lib/librte_power.a 00:03:17.708 [185/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:03:17.708 [186/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:03:17.708 [187/378] 
Linking static target drivers/libtmp_rte_bus_vdev.a 00:03:17.708 [188/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:03:17.708 [189/378] Linking static target lib/librte_reorder.a 00:03:17.708 [190/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:03:17.708 [191/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:03:17.708 [192/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:03:17.708 [193/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:03:17.708 [194/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:03:17.970 [195/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:03:17.970 [196/378] Linking static target lib/librte_security.a 00:03:17.970 [197/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:03:17.970 [198/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:03:17.970 [199/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:03:17.970 [200/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:03:17.970 [201/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:03:17.970 [202/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:03:17.970 [203/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:17.970 [204/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:17.970 [205/378] Linking static target drivers/librte_bus_auxiliary.a 00:03:17.970 [206/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:03:17.970 [207/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:03:17.970 [208/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:03:17.970 [209/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:03:17.970 [210/378] Linking static target lib/librte_hash.a 00:03:18.229 [211/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.229 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:03:18.229 [213/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:03:18.229 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:03:18.229 [215/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.229 [216/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:03:18.229 [217/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:03:18.229 [218/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:03:18.229 [219/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:03:18.229 [220/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:03:18.229 [221/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:03:18.229 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:03:18.229 [223/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:18.229 [224/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:18.229 [225/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:18.229 [226/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:18.229 [227/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:03:18.229 
[228/378] Linking static target drivers/librte_bus_vdev.a 00:03:18.229 [229/378] Linking static target drivers/librte_bus_pci.a 00:03:18.229 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:03:18.229 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:03:18.229 [232/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:03:18.229 [233/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:03:18.229 [234/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:03:18.229 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:03:18.229 [236/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:03:18.229 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:03:18.229 [238/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.229 [239/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:03:18.229 [240/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:03:18.229 [241/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.229 [242/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:03:18.229 [243/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:03:18.229 [244/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.229 [245/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:03:18.229 [246/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:03:18.488 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:03:18.488 [248/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.488 [249/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.488 [250/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:03:18.488 [251/378] Linking static target lib/librte_cryptodev.a 00:03:18.488 [252/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:03:18.488 [253/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:03:18.488 [254/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.488 [255/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:03:18.488 [256/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:03:18.488 [257/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:03:18.488 [258/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:03:18.488 [259/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.488 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:03:18.488 [261/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:03:18.488 [262/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:03:18.488 [263/378] Linking static target lib/librte_ethdev.a 00:03:18.488 [264/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.488 [265/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:03:18.746 [266/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:03:18.746 [267/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:03:18.747 [268/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:03:18.747 [269/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:03:18.747 [270/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:03:18.747 [271/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:03:18.747 [272/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:03:18.747 [273/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:03:18.747 [274/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:03:18.747 [275/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.747 [276/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:03:18.747 [277/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:18.747 [278/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:18.747 [279/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:03:18.747 [280/378] Linking static target drivers/librte_mempool_ring.a 00:03:18.747 [281/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:03:18.747 [282/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:03:18.747 [283/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:03:18.747 [284/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:03:18.747 [285/378] Linking static target drivers/libtmp_rte_common_mlx5.a 00:03:18.747 [286/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:03:18.747 [287/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:03:18.747 [288/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:03:18.747 [289/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:03:18.747 [290/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:18.747 [291/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:03:19.006 [292/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:03:19.007 [293/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:03:19.007 [294/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:03:19.007 [295/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:03:19.007 [296/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:03:19.007 [297/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:03:19.007 [298/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:19.007 [299/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:19.007 [300/378] Linking static target drivers/librte_crypto_mlx5.a 00:03:19.007 [301/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:03:19.007 [302/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:03:19.007 [303/378] 
Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:19.007 [304/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:19.007 [305/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:03:19.007 [306/378] Linking static target drivers/librte_common_mlx5.a 00:03:19.007 [307/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:19.266 [308/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:19.266 [309/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:19.266 [310/378] Linking static target drivers/librte_compress_isal.a 00:03:19.266 [311/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:19.266 [312/378] Linking static target drivers/librte_compress_mlx5.a 00:03:19.266 [313/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:03:19.266 [314/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:03:19.266 [315/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:19.266 [316/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:19.266 [317/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:03:19.833 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:03:19.833 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:03:20.091 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:03:20.091 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:20.091 [322/378] Compiling C object 
drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:20.091 [323/378] Linking static target drivers/librte_common_qat.a 00:03:20.349 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:03:20.608 [325/378] Linking static target lib/librte_vhost.a 00:03:20.608 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:23.151 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:03:25.055 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:03:28.341 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.247 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:03:30.247 [331/378] Linking target lib/librte_eal.so.24.1 00:03:30.247 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:03:30.247 [333/378] Linking target lib/librte_meter.so.24.1 00:03:30.247 [334/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:03:30.247 [335/378] Linking target lib/librte_dmadev.so.24.1 00:03:30.247 [336/378] Linking target lib/librte_ring.so.24.1 00:03:30.247 [337/378] Linking target lib/librte_pci.so.24.1 00:03:30.247 [338/378] Linking target lib/librte_timer.so.24.1 00:03:30.247 [339/378] Linking target drivers/librte_bus_vdev.so.24.1 00:03:30.504 [340/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:03:30.504 [341/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:03:30.505 [342/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:03:30.505 [343/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:03:30.505 [344/378] Generating symbol file 
drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:03:30.505 [345/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:03:30.505 [346/378] Linking target lib/librte_rcu.so.24.1 00:03:30.505 [347/378] Linking target lib/librte_mempool.so.24.1 00:03:30.505 [348/378] Linking target drivers/librte_bus_pci.so.24.1 00:03:30.505 [349/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:03:30.763 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:03:30.763 [351/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:03:30.763 [352/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:03:30.763 [353/378] Linking target lib/librte_mbuf.so.24.1 00:03:30.763 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:03:31.021 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:03:31.021 [356/378] Linking target lib/librte_cryptodev.so.24.1 00:03:31.021 [357/378] Linking target lib/librte_reorder.so.24.1 00:03:31.021 [358/378] Linking target lib/librte_compressdev.so.24.1 00:03:31.021 [359/378] Linking target lib/librte_net.so.24.1 00:03:31.280 [360/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:03:31.280 [361/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:03:31.280 [362/378] Linking target drivers/librte_compress_isal.so.24.1 00:03:31.280 [363/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:03:31.280 [364/378] Linking target lib/librte_security.so.24.1 00:03:31.280 [365/378] Linking target lib/librte_hash.so.24.1 00:03:31.280 [366/378] Linking target lib/librte_cmdline.so.24.1 00:03:31.280 [367/378] Linking target lib/librte_ethdev.so.24.1 00:03:31.538 [368/378] Generating symbol file 
lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:03:31.538 [369/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:03:31.538 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:03:31.538 [371/378] Linking target drivers/librte_common_mlx5.so.24.1 00:03:31.538 [372/378] Linking target lib/librte_power.so.24.1 00:03:31.538 [373/378] Linking target lib/librte_vhost.so.24.1 00:03:31.797 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:03:31.797 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:03:31.797 [376/378] Linking target drivers/librte_common_qat.so.24.1 00:03:31.797 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:03:31.797 [378/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:03:31.797 INFO: autodetecting backend as ninja 00:03:31.797 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72 00:03:33.175 CC lib/log/log.o 00:03:33.175 CC lib/log/log_flags.o 00:03:33.175 CC lib/log/log_deprecated.o 00:03:33.175 CC lib/ut_mock/mock.o 00:03:33.175 CC lib/ut/ut.o 00:03:33.175 LIB libspdk_log.a 00:03:33.175 LIB libspdk_ut_mock.a 00:03:33.175 LIB libspdk_ut.a 00:03:33.175 SO libspdk_log.so.7.0 00:03:33.175 SO libspdk_ut_mock.so.6.0 00:03:33.175 SO libspdk_ut.so.2.0 00:03:33.433 SYMLINK libspdk_ut_mock.so 00:03:33.433 SYMLINK libspdk_log.so 00:03:33.433 SYMLINK libspdk_ut.so 00:03:33.692 CC lib/util/base64.o 00:03:33.692 CC lib/util/bit_array.o 00:03:33.692 CC lib/util/cpuset.o 00:03:33.692 CC lib/util/crc16.o 00:03:33.692 CC lib/util/crc32.o 00:03:33.692 CC lib/util/crc32c.o 00:03:33.692 CC lib/util/crc32_ieee.o 00:03:33.692 CC lib/util/crc64.o 00:03:33.692 CC lib/util/fd.o 00:03:33.692 CC lib/util/dif.o 00:03:33.692 CC lib/util/file.o 00:03:33.692 CC lib/util/math.o 00:03:33.692 CC lib/util/hexlify.o 
00:03:33.692 CC lib/util/iov.o 00:03:33.692 CC lib/util/pipe.o 00:03:33.692 CC lib/util/string.o 00:03:33.692 CC lib/util/uuid.o 00:03:33.692 CC lib/util/strerror_tls.o 00:03:33.692 CC lib/util/fd_group.o 00:03:33.692 CC lib/util/xor.o 00:03:33.692 CC lib/dma/dma.o 00:03:33.692 CXX lib/trace_parser/trace.o 00:03:33.692 CC lib/ioat/ioat.o 00:03:33.692 CC lib/util/zipf.o 00:03:33.950 CC lib/vfio_user/host/vfio_user_pci.o 00:03:33.950 CC lib/vfio_user/host/vfio_user.o 00:03:33.950 LIB libspdk_dma.a 00:03:33.950 LIB libspdk_ioat.a 00:03:33.950 SO libspdk_dma.so.4.0 00:03:33.950 SO libspdk_ioat.so.7.0 00:03:33.950 SYMLINK libspdk_dma.so 00:03:34.208 SYMLINK libspdk_ioat.so 00:03:34.208 LIB libspdk_vfio_user.a 00:03:34.208 SO libspdk_vfio_user.so.5.0 00:03:34.208 LIB libspdk_util.a 00:03:34.208 SYMLINK libspdk_vfio_user.so 00:03:34.467 SO libspdk_util.so.9.1 00:03:34.467 SYMLINK libspdk_util.so 00:03:34.726 LIB libspdk_trace_parser.a 00:03:34.726 SO libspdk_trace_parser.so.5.0 00:03:34.726 SYMLINK libspdk_trace_parser.so 00:03:34.726 CC lib/json/json_util.o 00:03:34.726 CC lib/json/json_write.o 00:03:34.726 CC lib/json/json_parse.o 00:03:34.984 CC lib/vmd/vmd.o 00:03:34.984 CC lib/vmd/led.o 00:03:34.984 CC lib/reduce/reduce.o 00:03:34.984 CC lib/conf/conf.o 00:03:34.984 CC lib/rdma_provider/common.o 00:03:34.984 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:34.984 CC lib/env_dpdk/env.o 00:03:34.984 CC lib/env_dpdk/memory.o 00:03:34.984 CC lib/rdma_utils/rdma_utils.o 00:03:34.984 CC lib/env_dpdk/pci.o 00:03:34.984 CC lib/env_dpdk/init.o 00:03:34.984 CC lib/env_dpdk/threads.o 00:03:34.984 CC lib/env_dpdk/pci_ioat.o 00:03:34.984 CC lib/env_dpdk/pci_virtio.o 00:03:34.984 CC lib/idxd/idxd.o 00:03:34.984 CC lib/env_dpdk/pci_vmd.o 00:03:34.984 CC lib/env_dpdk/pci_idxd.o 00:03:34.984 CC lib/idxd/idxd_user.o 00:03:34.984 CC lib/env_dpdk/pci_event.o 00:03:34.984 CC lib/env_dpdk/pci_dpdk.o 00:03:34.984 CC lib/idxd/idxd_kernel.o 00:03:34.984 CC lib/env_dpdk/sigbus_handler.o 
00:03:34.984 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:34.984 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:34.984 LIB libspdk_rdma_provider.a 00:03:35.244 SO libspdk_rdma_provider.so.6.0 00:03:35.244 LIB libspdk_conf.a 00:03:35.244 SO libspdk_conf.so.6.0 00:03:35.244 LIB libspdk_json.a 00:03:35.244 LIB libspdk_rdma_utils.a 00:03:35.244 SYMLINK libspdk_rdma_provider.so 00:03:35.244 SO libspdk_rdma_utils.so.1.0 00:03:35.244 SO libspdk_json.so.6.0 00:03:35.244 SYMLINK libspdk_conf.so 00:03:35.244 SYMLINK libspdk_rdma_utils.so 00:03:35.244 SYMLINK libspdk_json.so 00:03:35.503 LIB libspdk_idxd.a 00:03:35.503 SO libspdk_idxd.so.12.0 00:03:35.503 LIB libspdk_vmd.a 00:03:35.503 LIB libspdk_reduce.a 00:03:35.503 SO libspdk_vmd.so.6.0 00:03:35.503 SO libspdk_reduce.so.6.0 00:03:35.503 SYMLINK libspdk_idxd.so 00:03:35.763 SYMLINK libspdk_reduce.so 00:03:35.763 CC lib/jsonrpc/jsonrpc_server.o 00:03:35.763 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:35.763 CC lib/jsonrpc/jsonrpc_client.o 00:03:35.763 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:35.763 SYMLINK libspdk_vmd.so 00:03:36.054 LIB libspdk_jsonrpc.a 00:03:36.054 SO libspdk_jsonrpc.so.6.0 00:03:36.054 SYMLINK libspdk_jsonrpc.so 00:03:36.312 LIB libspdk_env_dpdk.a 00:03:36.312 SO libspdk_env_dpdk.so.14.1 00:03:36.569 CC lib/rpc/rpc.o 00:03:36.569 SYMLINK libspdk_env_dpdk.so 00:03:36.826 LIB libspdk_rpc.a 00:03:36.826 SO libspdk_rpc.so.6.0 00:03:36.826 SYMLINK libspdk_rpc.so 00:03:37.084 CC lib/trace/trace.o 00:03:37.084 CC lib/trace/trace_flags.o 00:03:37.084 CC lib/trace/trace_rpc.o 00:03:37.084 CC lib/notify/notify.o 00:03:37.084 CC lib/keyring/keyring.o 00:03:37.084 CC lib/notify/notify_rpc.o 00:03:37.084 CC lib/keyring/keyring_rpc.o 00:03:37.341 LIB libspdk_notify.a 00:03:37.341 SO libspdk_notify.so.6.0 00:03:37.341 LIB libspdk_trace.a 00:03:37.341 LIB libspdk_keyring.a 00:03:37.600 SO libspdk_trace.so.10.0 00:03:37.600 SYMLINK libspdk_notify.so 00:03:37.600 SO libspdk_keyring.so.1.0 00:03:37.600 SYMLINK libspdk_trace.so 
00:03:37.600 SYMLINK libspdk_keyring.so 00:03:37.857 CC lib/sock/sock.o 00:03:37.857 CC lib/sock/sock_rpc.o 00:03:37.857 CC lib/thread/thread.o 00:03:37.857 CC lib/thread/iobuf.o 00:03:38.791 LIB libspdk_sock.a 00:03:38.791 SO libspdk_sock.so.10.0 00:03:38.791 SYMLINK libspdk_sock.so 00:03:39.358 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:39.358 CC lib/nvme/nvme_ctrlr.o 00:03:39.358 CC lib/nvme/nvme_fabric.o 00:03:39.358 CC lib/nvme/nvme_ns_cmd.o 00:03:39.358 CC lib/nvme/nvme_ns.o 00:03:39.358 CC lib/nvme/nvme_pcie_common.o 00:03:39.358 CC lib/nvme/nvme_pcie.o 00:03:39.358 CC lib/nvme/nvme_qpair.o 00:03:39.358 CC lib/nvme/nvme.o 00:03:39.358 CC lib/nvme/nvme_quirks.o 00:03:39.358 CC lib/nvme/nvme_transport.o 00:03:39.358 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:39.358 CC lib/nvme/nvme_discovery.o 00:03:39.358 CC lib/nvme/nvme_tcp.o 00:03:39.358 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:39.358 CC lib/nvme/nvme_opal.o 00:03:39.358 CC lib/nvme/nvme_io_msg.o 00:03:39.358 CC lib/nvme/nvme_poll_group.o 00:03:39.358 CC lib/nvme/nvme_zns.o 00:03:39.358 CC lib/nvme/nvme_stubs.o 00:03:39.358 CC lib/nvme/nvme_cuse.o 00:03:39.358 CC lib/nvme/nvme_rdma.o 00:03:39.358 CC lib/nvme/nvme_auth.o 00:03:39.615 LIB libspdk_thread.a 00:03:39.615 SO libspdk_thread.so.10.1 00:03:39.615 SYMLINK libspdk_thread.so 00:03:39.874 CC lib/blob/blobstore.o 00:03:39.874 CC lib/blob/request.o 00:03:39.874 CC lib/blob/zeroes.o 00:03:39.874 CC lib/blob/blob_bs_dev.o 00:03:39.874 CC lib/init/json_config.o 00:03:39.874 CC lib/init/subsystem_rpc.o 00:03:39.874 CC lib/init/subsystem.o 00:03:39.874 CC lib/init/rpc.o 00:03:39.874 CC lib/virtio/virtio.o 00:03:39.874 CC lib/virtio/virtio_vhost_user.o 00:03:39.874 CC lib/virtio/virtio_vfio_user.o 00:03:39.874 CC lib/virtio/virtio_pci.o 00:03:39.874 CC lib/accel/accel_rpc.o 00:03:39.874 CC lib/accel/accel.o 00:03:39.874 CC lib/accel/accel_sw.o 00:03:40.133 LIB libspdk_init.a 00:03:40.133 SO libspdk_init.so.5.0 00:03:40.391 LIB libspdk_virtio.a 00:03:40.392 SYMLINK 
libspdk_init.so 00:03:40.392 SO libspdk_virtio.so.7.0 00:03:40.392 SYMLINK libspdk_virtio.so 00:03:40.651 CC lib/event/app.o 00:03:40.651 CC lib/event/reactor.o 00:03:40.651 CC lib/event/log_rpc.o 00:03:40.651 CC lib/event/app_rpc.o 00:03:40.651 CC lib/event/scheduler_static.o 00:03:40.651 LIB libspdk_accel.a 00:03:40.651 SO libspdk_accel.so.15.1 00:03:40.910 SYMLINK libspdk_accel.so 00:03:41.169 LIB libspdk_event.a 00:03:41.169 SO libspdk_event.so.14.0 00:03:41.169 CC lib/bdev/bdev.o 00:03:41.169 CC lib/bdev/bdev_zone.o 00:03:41.169 CC lib/bdev/bdev_rpc.o 00:03:41.169 CC lib/bdev/scsi_nvme.o 00:03:41.169 CC lib/bdev/part.o 00:03:41.169 SYMLINK libspdk_event.so 00:03:41.735 LIB libspdk_nvme.a 00:03:41.735 SO libspdk_nvme.so.13.1 00:03:41.993 SYMLINK libspdk_nvme.so 00:03:43.366 LIB libspdk_blob.a 00:03:43.366 SO libspdk_blob.so.11.0 00:03:43.623 SYMLINK libspdk_blob.so 00:03:43.882 LIB libspdk_bdev.a 00:03:43.882 CC lib/blobfs/blobfs.o 00:03:43.882 CC lib/blobfs/tree.o 00:03:43.882 CC lib/lvol/lvol.o 00:03:43.882 SO libspdk_bdev.so.15.1 00:03:43.882 SYMLINK libspdk_bdev.so 00:03:44.458 CC lib/ublk/ublk.o 00:03:44.458 CC lib/ublk/ublk_rpc.o 00:03:44.458 CC lib/ftl/ftl_core.o 00:03:44.458 CC lib/ftl/ftl_layout.o 00:03:44.458 CC lib/ftl/ftl_init.o 00:03:44.459 CC lib/ftl/ftl_debug.o 00:03:44.459 CC lib/nbd/nbd.o 00:03:44.459 CC lib/scsi/dev.o 00:03:44.459 CC lib/ftl/ftl_l2p.o 00:03:44.459 CC lib/ftl/ftl_io.o 00:03:44.459 CC lib/nbd/nbd_rpc.o 00:03:44.459 CC lib/scsi/lun.o 00:03:44.459 CC lib/scsi/scsi.o 00:03:44.459 CC lib/ftl/ftl_sb.o 00:03:44.459 CC lib/scsi/port.o 00:03:44.459 CC lib/nvmf/ctrlr.o 00:03:44.459 CC lib/scsi/scsi_bdev.o 00:03:44.459 CC lib/scsi/scsi_pr.o 00:03:44.459 CC lib/ftl/ftl_l2p_flat.o 00:03:44.459 CC lib/scsi/scsi_rpc.o 00:03:44.459 CC lib/nvmf/ctrlr_discovery.o 00:03:44.459 CC lib/ftl/ftl_nv_cache.o 00:03:44.459 CC lib/ftl/ftl_band.o 00:03:44.459 CC lib/scsi/task.o 00:03:44.459 CC lib/ftl/ftl_band_ops.o 00:03:44.459 CC lib/ftl/ftl_writer.o 
00:03:44.459 CC lib/nvmf/ctrlr_bdev.o 00:03:44.459 CC lib/nvmf/subsystem.o 00:03:44.459 CC lib/nvmf/nvmf.o 00:03:44.459 CC lib/ftl/ftl_rq.o 00:03:44.459 CC lib/nvmf/nvmf_rpc.o 00:03:44.459 CC lib/ftl/ftl_reloc.o 00:03:44.459 CC lib/ftl/ftl_l2p_cache.o 00:03:44.459 CC lib/nvmf/transport.o 00:03:44.459 CC lib/ftl/ftl_p2l.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt.o 00:03:44.459 CC lib/nvmf/stubs.o 00:03:44.459 CC lib/nvmf/tcp.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:44.459 CC lib/nvmf/rdma.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:44.459 CC lib/nvmf/mdns_server.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:44.459 CC lib/nvmf/auth.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:44.459 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:44.459 CC lib/ftl/utils/ftl_conf.o 00:03:44.459 CC lib/ftl/utils/ftl_md.o 00:03:44.459 CC lib/ftl/utils/ftl_mempool.o 00:03:44.459 CC lib/ftl/utils/ftl_bitmap.o 00:03:44.459 CC lib/ftl/utils/ftl_property.o 00:03:44.459 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:44.459 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:44.459 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:44.459 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:44.459 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:44.459 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:44.459 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:44.459 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:44.459 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:44.459 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:44.459 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:44.459 CC lib/ftl/base/ftl_base_dev.o 00:03:44.719 CC lib/ftl/base/ftl_base_bdev.o 00:03:44.719 CC lib/ftl/ftl_trace.o 00:03:44.976 LIB libspdk_blobfs.a 
00:03:44.976 SO libspdk_blobfs.so.10.0 00:03:44.976 SYMLINK libspdk_blobfs.so 00:03:44.976 LIB libspdk_nbd.a 00:03:44.976 LIB libspdk_lvol.a 00:03:44.976 SO libspdk_lvol.so.10.0 00:03:44.976 SO libspdk_nbd.so.7.0 00:03:45.234 SYMLINK libspdk_nbd.so 00:03:45.234 LIB libspdk_scsi.a 00:03:45.234 LIB libspdk_ublk.a 00:03:45.234 SYMLINK libspdk_lvol.so 00:03:45.234 SO libspdk_scsi.so.9.0 00:03:45.234 SO libspdk_ublk.so.3.0 00:03:45.234 SYMLINK libspdk_ublk.so 00:03:45.234 SYMLINK libspdk_scsi.so 00:03:45.822 LIB libspdk_ftl.a 00:03:45.822 CC lib/iscsi/conn.o 00:03:45.822 CC lib/iscsi/init_grp.o 00:03:45.822 CC lib/iscsi/iscsi.o 00:03:45.822 CC lib/iscsi/param.o 00:03:45.822 CC lib/iscsi/md5.o 00:03:45.822 CC lib/iscsi/tgt_node.o 00:03:45.822 CC lib/iscsi/portal_grp.o 00:03:45.822 CC lib/iscsi/iscsi_subsystem.o 00:03:45.822 CC lib/iscsi/iscsi_rpc.o 00:03:45.822 CC lib/iscsi/task.o 00:03:45.822 CC lib/vhost/vhost.o 00:03:45.822 CC lib/vhost/vhost_rpc.o 00:03:45.822 CC lib/vhost/vhost_scsi.o 00:03:45.822 CC lib/vhost/vhost_blk.o 00:03:45.822 CC lib/vhost/rte_vhost_user.o 00:03:45.822 SO libspdk_ftl.so.9.0 00:03:46.386 SYMLINK libspdk_ftl.so 00:03:46.644 LIB libspdk_nvmf.a 00:03:46.902 LIB libspdk_vhost.a 00:03:46.902 SO libspdk_nvmf.so.18.1 00:03:46.902 SO libspdk_vhost.so.8.0 00:03:46.902 SYMLINK libspdk_vhost.so 00:03:47.158 SYMLINK libspdk_nvmf.so 00:03:47.158 LIB libspdk_iscsi.a 00:03:47.158 SO libspdk_iscsi.so.8.0 00:03:47.416 SYMLINK libspdk_iscsi.so 00:03:47.982 CC module/env_dpdk/env_dpdk_rpc.o 00:03:47.982 CC module/keyring/file/keyring_rpc.o 00:03:47.982 CC module/keyring/file/keyring.o 00:03:47.982 CC module/accel/ioat/accel_ioat.o 00:03:47.982 CC module/accel/ioat/accel_ioat_rpc.o 00:03:47.982 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:47.982 CC module/sock/posix/posix.o 00:03:47.982 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:47.982 CC module/accel/error/accel_error_rpc.o 00:03:47.982 CC module/accel/error/accel_error.o 00:03:47.982 CC 
module/accel/iaa/accel_iaa.o 00:03:47.982 CC module/accel/iaa/accel_iaa_rpc.o 00:03:47.982 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:03:47.982 CC module/keyring/linux/keyring.o 00:03:47.982 LIB libspdk_env_dpdk_rpc.a 00:03:47.982 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:03:47.982 CC module/keyring/linux/keyring_rpc.o 00:03:47.982 CC module/blob/bdev/blob_bdev.o 00:03:47.982 CC module/accel/dsa/accel_dsa.o 00:03:47.982 CC module/scheduler/gscheduler/gscheduler.o 00:03:47.982 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:03:47.982 CC module/accel/dsa/accel_dsa_rpc.o 00:03:47.982 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:03:47.982 SO libspdk_env_dpdk_rpc.so.6.0 00:03:48.239 SYMLINK libspdk_env_dpdk_rpc.so 00:03:48.239 LIB libspdk_keyring_file.a 00:03:48.239 LIB libspdk_scheduler_dpdk_governor.a 00:03:48.239 LIB libspdk_keyring_linux.a 00:03:48.239 LIB libspdk_accel_error.a 00:03:48.239 LIB libspdk_scheduler_gscheduler.a 00:03:48.239 SO libspdk_scheduler_dpdk_governor.so.4.0 00:03:48.239 LIB libspdk_accel_ioat.a 00:03:48.239 LIB libspdk_scheduler_dynamic.a 00:03:48.239 SO libspdk_keyring_file.so.1.0 00:03:48.239 SO libspdk_keyring_linux.so.1.0 00:03:48.239 SO libspdk_scheduler_gscheduler.so.4.0 00:03:48.239 SO libspdk_accel_error.so.2.0 00:03:48.239 LIB libspdk_accel_iaa.a 00:03:48.239 SO libspdk_scheduler_dynamic.so.4.0 00:03:48.239 SO libspdk_accel_ioat.so.6.0 00:03:48.239 LIB libspdk_blob_bdev.a 00:03:48.239 SYMLINK libspdk_keyring_file.so 00:03:48.239 SYMLINK libspdk_scheduler_dpdk_governor.so 00:03:48.239 SO libspdk_accel_iaa.so.3.0 00:03:48.239 SYMLINK libspdk_keyring_linux.so 00:03:48.239 SYMLINK libspdk_scheduler_gscheduler.so 00:03:48.497 LIB libspdk_accel_dsa.a 00:03:48.497 SO libspdk_blob_bdev.so.11.0 00:03:48.497 SYMLINK libspdk_scheduler_dynamic.so 00:03:48.497 SYMLINK libspdk_accel_ioat.so 00:03:48.497 SO libspdk_accel_dsa.so.5.0 00:03:48.497 SYMLINK libspdk_accel_iaa.so 00:03:48.497 
SYMLINK libspdk_accel_error.so 00:03:48.497 SYMLINK libspdk_blob_bdev.so 00:03:48.497 SYMLINK libspdk_accel_dsa.so 00:03:48.754 LIB libspdk_sock_posix.a 00:03:48.754 SO libspdk_sock_posix.so.6.0 00:03:49.012 CC module/bdev/delay/vbdev_delay.o 00:03:49.012 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:49.012 CC module/bdev/null/bdev_null.o 00:03:49.012 CC module/blobfs/bdev/blobfs_bdev.o 00:03:49.012 CC module/bdev/null/bdev_null_rpc.o 00:03:49.012 CC module/bdev/lvol/vbdev_lvol.o 00:03:49.012 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:49.012 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:49.012 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:49.012 CC module/bdev/ftl/bdev_ftl.o 00:03:49.012 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:49.012 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:49.012 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:49.012 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:49.012 CC module/bdev/compress/vbdev_compress.o 00:03:49.012 CC module/bdev/malloc/bdev_malloc.o 00:03:49.012 CC module/bdev/compress/vbdev_compress_rpc.o 00:03:49.012 CC module/bdev/crypto/vbdev_crypto.o 00:03:49.012 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:03:49.012 CC module/bdev/raid/bdev_raid.o 00:03:49.012 CC module/bdev/nvme/bdev_nvme.o 00:03:49.012 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:49.012 CC module/bdev/raid/bdev_raid_rpc.o 00:03:49.012 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:49.012 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:49.012 CC module/bdev/raid/bdev_raid_sb.o 00:03:49.012 CC module/bdev/nvme/nvme_rpc.o 00:03:49.012 CC module/bdev/nvme/bdev_mdns_client.o 00:03:49.012 CC module/bdev/raid/raid0.o 00:03:49.012 CC module/bdev/nvme/vbdev_opal.o 00:03:49.012 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:49.012 CC module/bdev/raid/raid1.o 00:03:49.012 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:49.012 CC module/bdev/gpt/gpt.o 00:03:49.012 CC module/bdev/raid/concat.o 00:03:49.012 CC module/bdev/gpt/vbdev_gpt.o 00:03:49.012 CC 
module/bdev/iscsi/bdev_iscsi.o 00:03:49.012 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:49.012 CC module/bdev/passthru/vbdev_passthru.o 00:03:49.012 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:49.012 CC module/bdev/error/vbdev_error.o 00:03:49.012 CC module/bdev/error/vbdev_error_rpc.o 00:03:49.012 CC module/bdev/aio/bdev_aio.o 00:03:49.012 CC module/bdev/aio/bdev_aio_rpc.o 00:03:49.012 CC module/bdev/split/vbdev_split.o 00:03:49.012 CC module/bdev/split/vbdev_split_rpc.o 00:03:49.012 SYMLINK libspdk_sock_posix.so 00:03:49.270 LIB libspdk_bdev_null.a 00:03:49.270 SO libspdk_bdev_null.so.6.0 00:03:49.270 LIB libspdk_blobfs_bdev.a 00:03:49.270 LIB libspdk_accel_dpdk_compressdev.a 00:03:49.270 SO libspdk_blobfs_bdev.so.6.0 00:03:49.270 LIB libspdk_bdev_gpt.a 00:03:49.270 SYMLINK libspdk_bdev_null.so 00:03:49.270 SO libspdk_accel_dpdk_compressdev.so.3.0 00:03:49.270 SO libspdk_bdev_gpt.so.6.0 00:03:49.270 LIB libspdk_bdev_crypto.a 00:03:49.270 LIB libspdk_bdev_zone_block.a 00:03:49.270 SYMLINK libspdk_blobfs_bdev.so 00:03:49.270 LIB libspdk_bdev_split.a 00:03:49.270 LIB libspdk_bdev_error.a 00:03:49.270 SYMLINK libspdk_accel_dpdk_compressdev.so 00:03:49.270 LIB libspdk_bdev_passthru.a 00:03:49.270 LIB libspdk_bdev_malloc.a 00:03:49.270 SO libspdk_bdev_crypto.so.6.0 00:03:49.270 SO libspdk_bdev_split.so.6.0 00:03:49.270 SYMLINK libspdk_bdev_gpt.so 00:03:49.526 SO libspdk_bdev_zone_block.so.6.0 00:03:49.526 SO libspdk_bdev_error.so.6.0 00:03:49.526 SO libspdk_bdev_passthru.so.6.0 00:03:49.526 LIB libspdk_bdev_aio.a 00:03:49.526 SO libspdk_bdev_malloc.so.6.0 00:03:49.526 LIB libspdk_bdev_iscsi.a 00:03:49.526 LIB libspdk_bdev_ftl.a 00:03:49.526 SYMLINK libspdk_bdev_split.so 00:03:49.526 SYMLINK libspdk_bdev_crypto.so 00:03:49.526 SO libspdk_bdev_aio.so.6.0 00:03:49.526 LIB libspdk_accel_dpdk_cryptodev.a 00:03:49.526 SYMLINK libspdk_bdev_error.so 00:03:49.526 SO libspdk_bdev_iscsi.so.6.0 00:03:49.526 LIB libspdk_bdev_compress.a 00:03:49.526 SO libspdk_bdev_ftl.so.6.0 
00:03:49.526 SYMLINK libspdk_bdev_zone_block.so 00:03:49.526 SYMLINK libspdk_bdev_passthru.so 00:03:49.526 LIB libspdk_bdev_delay.a 00:03:49.526 SYMLINK libspdk_bdev_malloc.so 00:03:49.526 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:03:49.526 SO libspdk_bdev_compress.so.6.0 00:03:49.526 SO libspdk_bdev_delay.so.6.0 00:03:49.526 SYMLINK libspdk_bdev_aio.so 00:03:49.526 SYMLINK libspdk_bdev_ftl.so 00:03:49.526 SYMLINK libspdk_bdev_iscsi.so 00:03:49.526 SYMLINK libspdk_bdev_delay.so 00:03:49.526 SYMLINK libspdk_bdev_compress.so 00:03:49.526 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:03:49.526 LIB libspdk_bdev_lvol.a 00:03:49.526 LIB libspdk_bdev_virtio.a 00:03:49.526 SO libspdk_bdev_lvol.so.6.0 00:03:49.782 SO libspdk_bdev_virtio.so.6.0 00:03:49.782 SYMLINK libspdk_bdev_lvol.so 00:03:49.782 SYMLINK libspdk_bdev_virtio.so 00:03:50.040 LIB libspdk_bdev_raid.a 00:03:50.040 SO libspdk_bdev_raid.so.6.0 00:03:50.040 SYMLINK libspdk_bdev_raid.so 00:03:51.415 LIB libspdk_bdev_nvme.a 00:03:51.415 SO libspdk_bdev_nvme.so.7.0 00:03:51.415 SYMLINK libspdk_bdev_nvme.so 00:03:51.981 CC module/event/subsystems/keyring/keyring.o 00:03:51.981 CC module/event/subsystems/iobuf/iobuf.o 00:03:51.981 CC module/event/subsystems/vmd/vmd.o 00:03:51.981 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:51.981 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:51.981 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:52.239 CC module/event/subsystems/sock/sock.o 00:03:52.239 CC module/event/subsystems/scheduler/scheduler.o 00:03:52.239 LIB libspdk_event_keyring.a 00:03:52.239 LIB libspdk_event_sock.a 00:03:52.239 SO libspdk_event_keyring.so.1.0 00:03:52.239 LIB libspdk_event_vhost_blk.a 00:03:52.239 LIB libspdk_event_iobuf.a 00:03:52.239 LIB libspdk_event_vmd.a 00:03:52.239 LIB libspdk_event_scheduler.a 00:03:52.239 SO libspdk_event_sock.so.5.0 00:03:52.239 SO libspdk_event_vhost_blk.so.3.0 00:03:52.239 SO libspdk_event_iobuf.so.3.0 00:03:52.239 SO libspdk_event_scheduler.so.4.0 00:03:52.239 SO 
libspdk_event_vmd.so.6.0 00:03:52.239 SYMLINK libspdk_event_keyring.so 00:03:52.498 SYMLINK libspdk_event_sock.so 00:03:52.498 SYMLINK libspdk_event_vhost_blk.so 00:03:52.498 SYMLINK libspdk_event_scheduler.so 00:03:52.498 SYMLINK libspdk_event_iobuf.so 00:03:52.498 SYMLINK libspdk_event_vmd.so 00:03:52.755 CC module/event/subsystems/accel/accel.o 00:03:53.013 LIB libspdk_event_accel.a 00:03:53.013 SO libspdk_event_accel.so.6.0 00:03:53.013 SYMLINK libspdk_event_accel.so 00:03:53.271 CC module/event/subsystems/bdev/bdev.o 00:03:53.548 LIB libspdk_event_bdev.a 00:03:53.548 SO libspdk_event_bdev.so.6.0 00:03:53.852 SYMLINK libspdk_event_bdev.so 00:03:54.111 CC module/event/subsystems/nbd/nbd.o 00:03:54.111 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:54.111 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:54.111 CC module/event/subsystems/scsi/scsi.o 00:03:54.111 CC module/event/subsystems/ublk/ublk.o 00:03:54.111 LIB libspdk_event_nbd.a 00:03:54.111 LIB libspdk_event_ublk.a 00:03:54.111 SO libspdk_event_nbd.so.6.0 00:03:54.111 LIB libspdk_event_scsi.a 00:03:54.111 SO libspdk_event_ublk.so.3.0 00:03:54.369 SO libspdk_event_scsi.so.6.0 00:03:54.369 LIB libspdk_event_nvmf.a 00:03:54.369 SYMLINK libspdk_event_nbd.so 00:03:54.369 SO libspdk_event_nvmf.so.6.0 00:03:54.369 SYMLINK libspdk_event_scsi.so 00:03:54.369 SYMLINK libspdk_event_ublk.so 00:03:54.369 SYMLINK libspdk_event_nvmf.so 00:03:54.627 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:54.627 CC module/event/subsystems/iscsi/iscsi.o 00:03:54.885 LIB libspdk_event_vhost_scsi.a 00:03:54.885 SO libspdk_event_vhost_scsi.so.3.0 00:03:54.885 LIB libspdk_event_iscsi.a 00:03:54.885 SYMLINK libspdk_event_vhost_scsi.so 00:03:54.885 SO libspdk_event_iscsi.so.6.0 00:03:54.885 SYMLINK libspdk_event_iscsi.so 00:03:55.143 SO libspdk.so.6.0 00:03:55.143 SYMLINK libspdk.so 00:03:55.720 CC app/spdk_nvme_discover/discovery_aer.o 00:03:55.720 TEST_HEADER include/spdk/accel.h 00:03:55.720 TEST_HEADER 
include/spdk/accel_module.h 00:03:55.720 CC app/trace_record/trace_record.o 00:03:55.720 TEST_HEADER include/spdk/assert.h 00:03:55.720 TEST_HEADER include/spdk/barrier.h 00:03:55.720 TEST_HEADER include/spdk/base64.h 00:03:55.720 CC app/spdk_top/spdk_top.o 00:03:55.720 CC app/spdk_nvme_identify/identify.o 00:03:55.720 CC app/spdk_lspci/spdk_lspci.o 00:03:55.720 TEST_HEADER include/spdk/bdev.h 00:03:55.720 TEST_HEADER include/spdk/bdev_module.h 00:03:55.720 TEST_HEADER include/spdk/bdev_zone.h 00:03:55.720 CC test/rpc_client/rpc_client_test.o 00:03:55.720 TEST_HEADER include/spdk/bit_array.h 00:03:55.720 TEST_HEADER include/spdk/bit_pool.h 00:03:55.720 CC app/spdk_nvme_perf/perf.o 00:03:55.720 TEST_HEADER include/spdk/blob_bdev.h 00:03:55.720 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:55.720 TEST_HEADER include/spdk/blobfs.h 00:03:55.720 TEST_HEADER include/spdk/blob.h 00:03:55.720 CXX app/trace/trace.o 00:03:55.720 TEST_HEADER include/spdk/config.h 00:03:55.720 TEST_HEADER include/spdk/conf.h 00:03:55.720 TEST_HEADER include/spdk/cpuset.h 00:03:55.720 TEST_HEADER include/spdk/crc64.h 00:03:55.720 TEST_HEADER include/spdk/crc16.h 00:03:55.720 TEST_HEADER include/spdk/dif.h 00:03:55.720 TEST_HEADER include/spdk/crc32.h 00:03:55.720 TEST_HEADER include/spdk/endian.h 00:03:55.720 TEST_HEADER include/spdk/dma.h 00:03:55.720 TEST_HEADER include/spdk/env_dpdk.h 00:03:55.720 TEST_HEADER include/spdk/env.h 00:03:55.720 TEST_HEADER include/spdk/event.h 00:03:55.720 TEST_HEADER include/spdk/fd_group.h 00:03:55.720 TEST_HEADER include/spdk/fd.h 00:03:55.720 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:55.720 TEST_HEADER include/spdk/file.h 00:03:55.720 TEST_HEADER include/spdk/ftl.h 00:03:55.720 TEST_HEADER include/spdk/gpt_spec.h 00:03:55.720 TEST_HEADER include/spdk/histogram_data.h 00:03:55.720 TEST_HEADER include/spdk/hexlify.h 00:03:55.720 TEST_HEADER include/spdk/idxd.h 00:03:55.720 TEST_HEADER include/spdk/idxd_spec.h 00:03:55.720 TEST_HEADER include/spdk/init.h 
00:03:55.720 TEST_HEADER include/spdk/ioat.h 00:03:55.720 TEST_HEADER include/spdk/ioat_spec.h 00:03:55.720 TEST_HEADER include/spdk/iscsi_spec.h 00:03:55.720 TEST_HEADER include/spdk/json.h 00:03:55.720 TEST_HEADER include/spdk/jsonrpc.h 00:03:55.720 TEST_HEADER include/spdk/keyring.h 00:03:55.720 TEST_HEADER include/spdk/likely.h 00:03:55.720 TEST_HEADER include/spdk/keyring_module.h 00:03:55.720 TEST_HEADER include/spdk/log.h 00:03:55.720 TEST_HEADER include/spdk/lvol.h 00:03:55.720 TEST_HEADER include/spdk/memory.h 00:03:55.720 TEST_HEADER include/spdk/mmio.h 00:03:55.720 TEST_HEADER include/spdk/notify.h 00:03:55.720 TEST_HEADER include/spdk/nbd.h 00:03:55.720 CC app/spdk_dd/spdk_dd.o 00:03:55.720 TEST_HEADER include/spdk/nvme.h 00:03:55.720 TEST_HEADER include/spdk/nvme_intel.h 00:03:55.720 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:55.720 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:55.720 CC app/iscsi_tgt/iscsi_tgt.o 00:03:55.720 TEST_HEADER include/spdk/nvme_spec.h 00:03:55.720 TEST_HEADER include/spdk/nvme_zns.h 00:03:55.720 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:55.720 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:55.720 TEST_HEADER include/spdk/nvmf.h 00:03:55.720 TEST_HEADER include/spdk/nvmf_spec.h 00:03:55.720 TEST_HEADER include/spdk/nvmf_transport.h 00:03:55.720 TEST_HEADER include/spdk/opal.h 00:03:55.720 TEST_HEADER include/spdk/pci_ids.h 00:03:55.720 TEST_HEADER include/spdk/opal_spec.h 00:03:55.720 TEST_HEADER include/spdk/pipe.h 00:03:55.720 TEST_HEADER include/spdk/queue.h 00:03:55.720 TEST_HEADER include/spdk/reduce.h 00:03:55.720 TEST_HEADER include/spdk/rpc.h 00:03:55.720 TEST_HEADER include/spdk/scheduler.h 00:03:55.720 TEST_HEADER include/spdk/scsi.h 00:03:55.720 TEST_HEADER include/spdk/scsi_spec.h 00:03:55.720 TEST_HEADER include/spdk/sock.h 00:03:55.720 TEST_HEADER include/spdk/stdinc.h 00:03:55.720 TEST_HEADER include/spdk/string.h 00:03:55.720 TEST_HEADER include/spdk/thread.h 00:03:55.720 TEST_HEADER include/spdk/trace.h 
00:03:55.720 TEST_HEADER include/spdk/trace_parser.h 00:03:55.720 TEST_HEADER include/spdk/tree.h 00:03:55.720 TEST_HEADER include/spdk/ublk.h 00:03:55.720 TEST_HEADER include/spdk/util.h 00:03:55.720 TEST_HEADER include/spdk/uuid.h 00:03:55.720 TEST_HEADER include/spdk/version.h 00:03:55.720 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:55.720 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:55.720 TEST_HEADER include/spdk/vhost.h 00:03:55.720 TEST_HEADER include/spdk/vmd.h 00:03:55.720 TEST_HEADER include/spdk/xor.h 00:03:55.720 TEST_HEADER include/spdk/zipf.h 00:03:55.720 CXX test/cpp_headers/accel.o 00:03:55.720 CXX test/cpp_headers/accel_module.o 00:03:55.720 CXX test/cpp_headers/assert.o 00:03:55.720 CXX test/cpp_headers/barrier.o 00:03:55.720 CXX test/cpp_headers/base64.o 00:03:55.720 CXX test/cpp_headers/bdev_module.o 00:03:55.720 CXX test/cpp_headers/bdev.o 00:03:55.720 CXX test/cpp_headers/bdev_zone.o 00:03:55.720 CXX test/cpp_headers/bit_array.o 00:03:55.720 CXX test/cpp_headers/blob_bdev.o 00:03:55.720 CXX test/cpp_headers/bit_pool.o 00:03:55.720 CXX test/cpp_headers/blobfs_bdev.o 00:03:55.720 CXX test/cpp_headers/blobfs.o 00:03:55.720 CXX test/cpp_headers/blob.o 00:03:55.720 CXX test/cpp_headers/conf.o 00:03:55.720 CXX test/cpp_headers/config.o 00:03:55.720 CXX test/cpp_headers/crc16.o 00:03:55.720 CXX test/cpp_headers/cpuset.o 00:03:55.720 CC app/spdk_tgt/spdk_tgt.o 00:03:55.720 CXX test/cpp_headers/crc32.o 00:03:55.720 CXX test/cpp_headers/crc64.o 00:03:55.720 CXX test/cpp_headers/dif.o 00:03:55.720 CXX test/cpp_headers/endian.o 00:03:55.720 CXX test/cpp_headers/dma.o 00:03:55.720 CXX test/cpp_headers/env.o 00:03:55.720 CXX test/cpp_headers/event.o 00:03:55.720 CXX test/cpp_headers/fd_group.o 00:03:55.720 CXX test/cpp_headers/fd.o 00:03:55.720 CXX test/cpp_headers/env_dpdk.o 00:03:55.720 CXX test/cpp_headers/file.o 00:03:55.720 CXX test/cpp_headers/ftl.o 00:03:55.720 CXX test/cpp_headers/gpt_spec.o 00:03:55.720 CXX test/cpp_headers/hexlify.o 
00:03:55.720 CXX test/cpp_headers/histogram_data.o 00:03:55.720 CXX test/cpp_headers/idxd.o 00:03:55.720 CXX test/cpp_headers/ioat.o 00:03:55.720 CXX test/cpp_headers/init.o 00:03:55.720 CXX test/cpp_headers/idxd_spec.o 00:03:55.720 CXX test/cpp_headers/ioat_spec.o 00:03:55.720 CXX test/cpp_headers/iscsi_spec.o 00:03:55.720 CXX test/cpp_headers/json.o 00:03:55.720 CXX test/cpp_headers/jsonrpc.o 00:03:55.720 CXX test/cpp_headers/keyring.o 00:03:55.720 CC app/nvmf_tgt/nvmf_main.o 00:03:55.720 CC examples/ioat/perf/perf.o 00:03:55.720 CXX test/cpp_headers/keyring_module.o 00:03:55.720 CC examples/ioat/verify/verify.o 00:03:55.720 CC test/env/pci/pci_ut.o 00:03:55.720 CC test/thread/poller_perf/poller_perf.o 00:03:55.720 CC examples/util/zipf/zipf.o 00:03:55.720 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:55.720 CC test/env/memory/memory_ut.o 00:03:55.720 CC app/fio/nvme/fio_plugin.o 00:03:55.720 CC test/app/jsoncat/jsoncat.o 00:03:55.720 CC test/app/histogram_perf/histogram_perf.o 00:03:55.720 CC test/env/vtophys/vtophys.o 00:03:55.983 CC test/dma/test_dma/test_dma.o 00:03:55.983 CC app/fio/bdev/fio_plugin.o 00:03:55.983 CC test/app/stub/stub.o 00:03:55.983 CC test/app/bdev_svc/bdev_svc.o 00:03:55.983 LINK spdk_lspci 00:03:55.983 LINK rpc_client_test 00:03:55.983 LINK spdk_nvme_discover 00:03:55.983 LINK interrupt_tgt 00:03:56.242 LINK spdk_trace_record 00:03:56.242 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:56.242 CC test/env/mem_callbacks/mem_callbacks.o 00:03:56.242 LINK iscsi_tgt 00:03:56.242 CXX test/cpp_headers/likely.o 00:03:56.242 CXX test/cpp_headers/log.o 00:03:56.242 CXX test/cpp_headers/lvol.o 00:03:56.242 LINK jsoncat 00:03:56.242 CXX test/cpp_headers/memory.o 00:03:56.242 CXX test/cpp_headers/mmio.o 00:03:56.242 CXX test/cpp_headers/nbd.o 00:03:56.242 LINK poller_perf 00:03:56.242 CXX test/cpp_headers/notify.o 00:03:56.242 CXX test/cpp_headers/nvme.o 00:03:56.242 CXX test/cpp_headers/nvme_intel.o 00:03:56.242 CXX 
test/cpp_headers/nvme_ocssd.o 00:03:56.242 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:56.242 LINK vtophys 00:03:56.242 CXX test/cpp_headers/nvme_spec.o 00:03:56.242 LINK env_dpdk_post_init 00:03:56.242 CXX test/cpp_headers/nvme_zns.o 00:03:56.242 CXX test/cpp_headers/nvmf_cmd.o 00:03:56.242 LINK zipf 00:03:56.242 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:56.242 LINK spdk_tgt 00:03:56.242 CXX test/cpp_headers/nvmf.o 00:03:56.242 CXX test/cpp_headers/nvmf_spec.o 00:03:56.242 CXX test/cpp_headers/nvmf_transport.o 00:03:56.242 CXX test/cpp_headers/opal.o 00:03:56.242 LINK nvmf_tgt 00:03:56.242 CXX test/cpp_headers/opal_spec.o 00:03:56.242 CXX test/cpp_headers/pci_ids.o 00:03:56.242 CXX test/cpp_headers/pipe.o 00:03:56.242 CXX test/cpp_headers/queue.o 00:03:56.242 CXX test/cpp_headers/reduce.o 00:03:56.242 LINK histogram_perf 00:03:56.242 LINK ioat_perf 00:03:56.242 CXX test/cpp_headers/rpc.o 00:03:56.242 CXX test/cpp_headers/scheduler.o 00:03:56.242 CXX test/cpp_headers/scsi.o 00:03:56.242 CXX test/cpp_headers/scsi_spec.o 00:03:56.242 CXX test/cpp_headers/sock.o 00:03:56.242 CXX test/cpp_headers/stdinc.o 00:03:56.242 CXX test/cpp_headers/string.o 00:03:56.242 CXX test/cpp_headers/thread.o 00:03:56.242 CXX test/cpp_headers/trace.o 00:03:56.242 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:56.242 CXX test/cpp_headers/trace_parser.o 00:03:56.242 CXX test/cpp_headers/tree.o 00:03:56.242 CXX test/cpp_headers/ublk.o 00:03:56.242 CXX test/cpp_headers/util.o 00:03:56.242 CXX test/cpp_headers/uuid.o 00:03:56.242 CXX test/cpp_headers/version.o 00:03:56.242 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:56.242 LINK bdev_svc 00:03:56.242 CXX test/cpp_headers/vfio_user_pci.o 00:03:56.502 CXX test/cpp_headers/vfio_user_spec.o 00:03:56.502 LINK verify 00:03:56.502 CXX test/cpp_headers/vhost.o 00:03:56.502 CXX test/cpp_headers/vmd.o 00:03:56.502 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:56.502 CXX test/cpp_headers/xor.o 00:03:56.502 CXX test/cpp_headers/zipf.o 00:03:56.502 
LINK spdk_dd 00:03:56.760 LINK pci_ut 00:03:56.760 LINK spdk_trace 00:03:56.760 LINK test_dma 00:03:56.760 LINK stub 00:03:56.760 CC test/event/reactor/reactor.o 00:03:56.760 CC test/event/event_perf/event_perf.o 00:03:56.760 CC test/event/reactor_perf/reactor_perf.o 00:03:56.760 CC test/event/app_repeat/app_repeat.o 00:03:56.760 CC test/event/scheduler/scheduler.o 00:03:56.760 LINK spdk_nvme 00:03:57.019 LINK nvme_fuzz 00:03:57.019 LINK spdk_bdev 00:03:57.019 CC examples/idxd/perf/perf.o 00:03:57.019 CC examples/sock/hello_world/hello_sock.o 00:03:57.019 CC examples/vmd/led/led.o 00:03:57.019 CC examples/vmd/lsvmd/lsvmd.o 00:03:57.019 CC examples/thread/thread/thread_ex.o 00:03:57.019 LINK mem_callbacks 00:03:57.019 LINK reactor 00:03:57.019 LINK spdk_nvme_perf 00:03:57.019 LINK reactor_perf 00:03:57.019 LINK app_repeat 00:03:57.019 LINK vhost_fuzz 00:03:57.019 LINK event_perf 00:03:57.019 LINK spdk_nvme_identify 00:03:57.019 LINK led 00:03:57.019 LINK lsvmd 00:03:57.019 LINK scheduler 00:03:57.019 LINK spdk_top 00:03:57.277 LINK hello_sock 00:03:57.277 CC app/vhost/vhost.o 00:03:57.277 LINK thread 00:03:57.277 LINK idxd_perf 00:03:57.277 CC test/nvme/simple_copy/simple_copy.o 00:03:57.277 CC test/nvme/compliance/nvme_compliance.o 00:03:57.277 CC test/nvme/aer/aer.o 00:03:57.277 CC test/nvme/reserve/reserve.o 00:03:57.277 CC test/nvme/e2edp/nvme_dp.o 00:03:57.277 CC test/nvme/reset/reset.o 00:03:57.277 CC test/nvme/startup/startup.o 00:03:57.277 CC test/nvme/overhead/overhead.o 00:03:57.277 CC test/nvme/sgl/sgl.o 00:03:57.277 CC test/nvme/fused_ordering/fused_ordering.o 00:03:57.277 CC test/nvme/cuse/cuse.o 00:03:57.277 CC test/nvme/connect_stress/connect_stress.o 00:03:57.277 CC test/nvme/fdp/fdp.o 00:03:57.277 CC test/nvme/boot_partition/boot_partition.o 00:03:57.277 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:57.277 CC test/nvme/err_injection/err_injection.o 00:03:57.277 CC test/blobfs/mkfs/mkfs.o 00:03:57.277 CC test/accel/dif/dif.o 00:03:57.535 CC 
test/lvol/esnap/esnap.o 00:03:57.535 LINK vhost 00:03:57.535 LINK memory_ut 00:03:57.535 LINK boot_partition 00:03:57.535 LINK doorbell_aers 00:03:57.535 LINK startup 00:03:57.535 LINK connect_stress 00:03:57.535 LINK fused_ordering 00:03:57.535 LINK err_injection 00:03:57.535 LINK mkfs 00:03:57.535 LINK simple_copy 00:03:57.535 LINK reserve 00:03:57.535 LINK nvme_dp 00:03:57.535 LINK sgl 00:03:57.535 LINK aer 00:03:57.535 LINK overhead 00:03:57.535 LINK reset 00:03:57.535 LINK nvme_compliance 00:03:57.793 CC examples/nvme/reconnect/reconnect.o 00:03:57.793 CC examples/nvme/hotplug/hotplug.o 00:03:57.793 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:57.793 CC examples/nvme/arbitration/arbitration.o 00:03:57.793 LINK fdp 00:03:57.793 CC examples/nvme/abort/abort.o 00:03:57.793 CC examples/nvme/hello_world/hello_world.o 00:03:57.793 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:57.793 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:57.793 CC examples/accel/perf/accel_perf.o 00:03:57.793 LINK dif 00:03:57.793 CC examples/blob/hello_world/hello_blob.o 00:03:57.793 CC examples/blob/cli/blobcli.o 00:03:57.793 LINK hello_world 00:03:57.793 LINK pmr_persistence 00:03:58.051 LINK cmb_copy 00:03:58.051 LINK abort 00:03:58.051 LINK hotplug 00:03:58.051 LINK arbitration 00:03:58.051 LINK reconnect 00:03:58.051 LINK hello_blob 00:03:58.310 LINK nvme_manage 00:03:58.310 LINK accel_perf 00:03:58.310 LINK iscsi_fuzz 00:03:58.310 LINK blobcli 00:03:58.569 CC test/bdev/bdevio/bdevio.o 00:03:58.569 LINK cuse 00:03:58.828 LINK bdevio 00:03:58.828 CC examples/bdev/bdevperf/bdevperf.o 00:03:58.828 CC examples/bdev/hello_world/hello_bdev.o 00:03:59.396 LINK hello_bdev 00:03:59.655 LINK bdevperf 00:04:00.591 CC examples/nvmf/nvmf/nvmf.o 00:04:00.850 LINK nvmf 00:04:02.759 LINK esnap 00:04:03.019 00:04:03.019 real 1m31.199s 00:04:03.019 user 17m19.810s 00:04:03.019 sys 4m13.126s 00:04:03.019 10:11:39 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:04:03.019 10:11:39 
make -- common/autotest_common.sh@10 -- $ set +x 00:04:03.019 ************************************ 00:04:03.019 END TEST make 00:04:03.019 ************************************ 00:04:03.019 10:11:40 -- common/autotest_common.sh@1142 -- $ return 0 00:04:03.019 10:11:40 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:03.019 10:11:40 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:03.019 10:11:40 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:03.019 10:11:40 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.019 10:11:40 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:03.019 10:11:40 -- pm/common@44 -- $ pid=318017 00:04:03.019 10:11:40 -- pm/common@50 -- $ kill -TERM 318017 00:04:03.019 10:11:40 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.019 10:11:40 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:03.019 10:11:40 -- pm/common@44 -- $ pid=318019 00:04:03.019 10:11:40 -- pm/common@50 -- $ kill -TERM 318019 00:04:03.019 10:11:40 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.019 10:11:40 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:03.019 10:11:40 -- pm/common@44 -- $ pid=318021 00:04:03.019 10:11:40 -- pm/common@50 -- $ kill -TERM 318021 00:04:03.019 10:11:40 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.019 10:11:40 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:03.019 10:11:40 -- pm/common@44 -- $ pid=318046 00:04:03.019 10:11:40 -- pm/common@50 -- $ sudo -E kill -TERM 318046 00:04:03.019 10:11:40 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:04:03.019 10:11:40 -- nvmf/common.sh@7 -- # uname -s 
00:04:03.019 10:11:40 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:03.019 10:11:40 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:03.019 10:11:40 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:03.019 10:11:40 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:03.019 10:11:40 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:03.019 10:11:40 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:03.019 10:11:40 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:03.019 10:11:40 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:03.019 10:11:40 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:03.019 10:11:40 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:03.019 10:11:40 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:04:03.019 10:11:40 -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:04:03.019 10:11:40 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:03.019 10:11:40 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:03.019 10:11:40 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:03.019 10:11:40 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:03.019 10:11:40 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:04:03.019 10:11:40 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:03.019 10:11:40 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:03.019 10:11:40 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:03.019 10:11:40 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:03.019 10:11:40 -- 
paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:03.019 10:11:40 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:03.019 10:11:40 -- paths/export.sh@5 -- # export PATH 00:04:03.019 10:11:40 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:03.019 10:11:40 -- nvmf/common.sh@47 -- # : 0 00:04:03.019 10:11:40 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:04:03.019 10:11:40 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:04:03.019 10:11:40 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:03.019 10:11:40 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:03.019 10:11:40 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:03.019 10:11:40 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:04:03.019 10:11:40 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:04:03.019 10:11:40 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:04:03.019 10:11:40 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:03.019 10:11:40 -- spdk/autotest.sh@32 -- # uname -s 00:04:03.019 10:11:40 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:03.019 10:11:40 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:03.019 10:11:40 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:03.279 10:11:40 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:03.279 10:11:40 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:03.279 10:11:40 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:03.279 10:11:40 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:03.279 10:11:40 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:03.279 10:11:40 -- spdk/autotest.sh@48 -- # udevadm_pid=384344 00:04:03.279 10:11:40 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:03.279 10:11:40 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:03.279 10:11:40 -- pm/common@17 -- # local monitor 00:04:03.279 10:11:40 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.279 10:11:40 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.279 10:11:40 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.279 10:11:40 -- pm/common@21 -- # date +%s 00:04:03.279 10:11:40 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:03.279 10:11:40 -- pm/common@21 -- # date +%s 00:04:03.279 10:11:40 -- pm/common@25 -- # sleep 1 00:04:03.279 10:11:40 -- pm/common@21 -- # date +%s 00:04:03.279 10:11:40 -- pm/common@21 -- # date +%s 00:04:03.279 10:11:40 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721031100 00:04:03.279 10:11:40 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721031100 00:04:03.279 10:11:40 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721031100 00:04:03.279 10:11:40 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721031100 00:04:03.279 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721031100_collect-vmstat.pm.log 00:04:03.279 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721031100_collect-cpu-load.pm.log 00:04:03.279 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721031100_collect-cpu-temp.pm.log 00:04:03.279 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721031100_collect-bmc-pm.bmc.pm.log 00:04:04.216 10:11:41 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:04.216 10:11:41 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:04.216 10:11:41 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:04.216 10:11:41 -- common/autotest_common.sh@10 -- # set +x 00:04:04.216 10:11:41 -- spdk/autotest.sh@59 -- # create_test_list 00:04:04.216 10:11:41 -- common/autotest_common.sh@746 -- # xtrace_disable 00:04:04.216 10:11:41 -- common/autotest_common.sh@10 -- # set +x 00:04:04.216 10:11:41 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:04:04.216 10:11:41 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:04.216 10:11:41 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:04.216 10:11:41 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:04:04.216 10:11:41 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:04:04.216 10:11:41 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:04.216 10:11:41 -- common/autotest_common.sh@1455 -- # uname 00:04:04.216 10:11:41 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:04:04.216 10:11:41 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:04.216 10:11:41 -- common/autotest_common.sh@1475 -- # uname 00:04:04.216 10:11:41 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:04:04.216 10:11:41 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:04:04.216 10:11:41 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:04:04.216 10:11:41 -- spdk/autotest.sh@72 -- # hash lcov 00:04:04.216 10:11:41 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:04:04.216 10:11:41 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:04:04.216 --rc lcov_branch_coverage=1 00:04:04.216 --rc lcov_function_coverage=1 00:04:04.216 --rc genhtml_branch_coverage=1 00:04:04.216 --rc genhtml_function_coverage=1 00:04:04.216 --rc genhtml_legend=1 00:04:04.216 --rc geninfo_all_blocks=1 00:04:04.216 ' 00:04:04.216 10:11:41 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:04:04.216 --rc lcov_branch_coverage=1 00:04:04.216 --rc lcov_function_coverage=1 00:04:04.216 --rc genhtml_branch_coverage=1 00:04:04.216 --rc genhtml_function_coverage=1 00:04:04.216 --rc genhtml_legend=1 00:04:04.216 --rc geninfo_all_blocks=1 00:04:04.216 ' 00:04:04.216 10:11:41 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:04:04.216 --rc lcov_branch_coverage=1 00:04:04.216 --rc lcov_function_coverage=1 00:04:04.216 --rc genhtml_branch_coverage=1 00:04:04.216 --rc genhtml_function_coverage=1 00:04:04.216 --rc genhtml_legend=1 00:04:04.216 --rc geninfo_all_blocks=1 00:04:04.216 --no-external' 00:04:04.216 10:11:41 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:04:04.216 --rc lcov_branch_coverage=1 00:04:04.216 --rc lcov_function_coverage=1 00:04:04.216 --rc genhtml_branch_coverage=1 00:04:04.216 --rc genhtml_function_coverage=1 00:04:04.216 --rc 
genhtml_legend=1 00:04:04.216 --rc geninfo_all_blocks=1 00:04:04.216 --no-external' 00:04:04.216 10:11:41 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:04:04.476 lcov: LCOV version 1.14 00:04:04.476 10:11:41 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:04:09.747 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:04:09.747 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV 
did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did 
not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:04:09.748 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:04:09.748 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 
00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no 
functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no 
functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:04:10.008 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:04:10.008 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:04:10.267 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:04:10.267 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:04:10.267 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:04:10.267 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:04:10.267 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:04:10.267 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:04:10.267 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any 
data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:04:10.268 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:04:10.268 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:04:10.268 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:04:32.197 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:04:32.197 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:40.323 10:12:16 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:04:40.323 10:12:16 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:40.323 10:12:16 -- common/autotest_common.sh@10 -- # set +x 00:04:40.323 10:12:16 -- spdk/autotest.sh@91 -- # rm -f 00:04:40.323 10:12:16 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:42.858 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:04:42.858 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:04:42.858 0000:5e:00.0 (8086 0b60): Already using the nvme 
driver 00:04:42.858 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:42.858 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:43.117 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:43.117 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:43.117 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:43.117 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:43.117 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:43.117 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:43.118 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:43.118 10:12:20 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:04:43.118 10:12:20 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:43.118 10:12:20 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:43.118 10:12:20 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:43.118 10:12:20 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:43.118 10:12:20 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:43.118 10:12:20 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:43.118 10:12:20 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:43.118 10:12:20 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:43.118 10:12:20 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:04:43.118 10:12:20 -- 
spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:04:43.118 10:12:20 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:04:43.118 10:12:20 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:04:43.118 10:12:20 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:04:43.118 10:12:20 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:43.377 No valid GPT data, bailing 00:04:43.377 10:12:20 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:43.377 10:12:20 -- scripts/common.sh@391 -- # pt= 00:04:43.377 10:12:20 -- scripts/common.sh@392 -- # return 1 00:04:43.377 10:12:20 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:43.377 1+0 records in 00:04:43.377 1+0 records out 00:04:43.377 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00484463 s, 216 MB/s 00:04:43.377 10:12:20 -- spdk/autotest.sh@118 -- # sync 00:04:43.377 10:12:20 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:43.377 10:12:20 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:43.377 10:12:20 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:48.654 10:12:25 -- spdk/autotest.sh@124 -- # uname -s 00:04:48.654 10:12:25 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:04:48.654 10:12:25 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:48.654 10:12:25 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:48.654 10:12:25 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:48.654 10:12:25 -- common/autotest_common.sh@10 -- # set +x 00:04:48.654 ************************************ 00:04:48.654 START TEST setup.sh 00:04:48.654 ************************************ 00:04:48.654 10:12:25 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:48.654 * 
Looking for test storage... 00:04:48.654 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:48.654 10:12:25 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:48.654 10:12:25 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:48.654 10:12:25 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:48.654 10:12:25 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:48.654 10:12:25 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:48.654 10:12:25 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:48.654 ************************************ 00:04:48.654 START TEST acl 00:04:48.654 ************************************ 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:48.654 * Looking for test storage... 00:04:48.654 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:48.654 10:12:25 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:48.654 10:12:25 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:48.654 10:12:25 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:48.654 
10:12:25 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:48.654 10:12:25 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:48.654 10:12:25 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:48.654 10:12:25 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:48.654 10:12:25 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:48.654 10:12:25 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:52.848 10:12:29 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:52.848 10:12:29 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:52.848 10:12:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:52.848 10:12:29 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:52.848 10:12:29 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:52.848 10:12:29 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:55.384 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:55.384 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.384 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.384 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ (8086 == *:*:*.* ]] 00:04:55.384 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.384 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 Hugepages 00:04:55.643 node hugesize free / total 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.643 10:12:32 
setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 00:04:55.643 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 
0000:00:04.4 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.643 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 
10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- 
# read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:85:05.5 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d7:05.5 == *:*:*.* ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # [[ vfio-pci == nvme ]] 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:55.644 10:12:32 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:55.644 10:12:32 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:55.644 10:12:32 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:55.644 10:12:32 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:55.903 ************************************ 00:04:55.903 START TEST denied 00:04:55.903 ************************************ 00:04:55.903 10:12:32 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:04:55.903 10:12:32 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:55.903 10:12:32 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:55.903 10:12:32 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:55.903 10:12:32 setup.sh.acl.denied -- setup/common.sh@9 
-- # [[ output == output ]] 00:04:55.903 10:12:32 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:59.192 0000:5e:00.0 (8086 0b60): Skipping denied controller at 0000:5e:00.0 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:59.192 10:12:36 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:04.505 00:05:04.505 real 0m8.255s 00:05:04.505 user 0m2.478s 00:05:04.505 sys 0m4.943s 00:05:04.505 10:12:41 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:04.505 10:12:41 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:04.505 ************************************ 00:05:04.505 END TEST denied 00:05:04.505 ************************************ 00:05:04.505 10:12:41 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:05:04.506 10:12:41 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:04.506 10:12:41 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:04.506 10:12:41 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:04.506 10:12:41 
setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:04.506 ************************************ 00:05:04.506 START TEST allowed 00:05:04.506 ************************************ 00:05:04.506 10:12:41 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:05:04.506 10:12:41 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:05:04.506 10:12:41 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:04.506 10:12:41 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:05:04.506 10:12:41 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.506 10:12:41 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:11.068 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:11.068 10:12:47 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:11.068 10:12:47 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:11.068 10:12:47 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:11.068 10:12:47 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:11.068 10:12:47 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:14.358 00:05:14.358 real 0m10.254s 00:05:14.358 user 0m2.699s 00:05:14.358 sys 0m5.023s 00:05:14.358 10:12:51 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:14.358 10:12:51 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:14.358 ************************************ 00:05:14.358 END TEST allowed 00:05:14.358 ************************************ 00:05:14.358 10:12:51 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:05:14.358 00:05:14.358 real 0m26.130s 00:05:14.358 user 0m7.824s 00:05:14.358 sys 0m15.100s 00:05:14.358 10:12:51 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:05:14.358 10:12:51 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:14.358 ************************************ 00:05:14.358 END TEST acl 00:05:14.358 ************************************ 00:05:14.358 10:12:51 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:14.358 10:12:51 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:14.358 10:12:51 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:14.358 10:12:51 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:14.358 10:12:51 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:14.618 ************************************ 00:05:14.618 START TEST hugepages 00:05:14.618 ************************************ 00:05:14.618 10:12:51 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:14.618 * Looking for test storage... 
00:05:14.618 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 76811788 kB' 'MemAvailable: 80110280 kB' 'Buffers: 12176 kB' 'Cached: 9438484 kB' 'SwapCached: 0 kB' 'Active: 6492692 kB' 'Inactive: 3456260 kB' 'Active(anon): 6099108 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 501480 kB' 'Mapped: 158564 kB' 'Shmem: 5600816 kB' 'KReclaimable: 204584 kB' 'Slab: 526896 kB' 'SReclaimable: 204584 kB' 'SUnreclaim: 322312 kB' 'KernelStack: 16160 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438188 kB' 'Committed_AS: 7523872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 
10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.618 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 
10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce 
== \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- 
setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.619 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@18 -- # 
global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 
00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:14.620 10:12:51 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:05:14.620 10:12:51 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:14.620 10:12:51 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:14.620 10:12:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:14.620 ************************************ 00:05:14.620 START TEST default_setup 00:05:14.620 ************************************ 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:14.620 10:12:51 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.620 10:12:51 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:18.808 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:18.808 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:18.808 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:00:04.4 (8086 2021): ioatdma 
-> vfio-pci 00:05:18.808 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:18.808 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:21.343 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local 
var val 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.343 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78917672 kB' 'MemAvailable: 82216084 kB' 'Buffers: 12176 kB' 'Cached: 9438604 kB' 'SwapCached: 0 kB' 'Active: 6510500 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116916 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518820 kB' 'Mapped: 158896 kB' 'Shmem: 5600936 kB' 'KReclaimable: 204424 kB' 'Slab: 525940 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321516 kB' 'KernelStack: 16496 kB' 'PageTables: 8440 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7541056 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201160 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.344 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # 
mapfile -t mem 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78921996 kB' 'MemAvailable: 82220408 kB' 'Buffers: 12176 kB' 'Cached: 9438608 kB' 'SwapCached: 0 kB' 'Active: 6509964 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116380 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518620 kB' 'Mapped: 158776 kB' 'Shmem: 5600940 kB' 'KReclaimable: 204424 kB' 'Slab: 526044 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321620 kB' 'KernelStack: 16240 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7539592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.345 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 
-- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 
10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.346 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 
00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78920488 kB' 'MemAvailable: 82218900 kB' 'Buffers: 12176 kB' 'Cached: 9438620 kB' 'SwapCached: 0 kB' 'Active: 6509408 kB' 'Inactive: 3456260 kB' 'Active(anon): 6115824 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518076 kB' 'Mapped: 158776 kB' 'Shmem: 5600952 kB' 'KReclaimable: 204424 kB' 'Slab: 526044 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321620 kB' 'KernelStack: 16224 kB' 'PageTables: 7840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7541096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:05:21.347 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # scanned remaining /proc/meminfo fields (Active(file) … HugePages_Free) against HugePages_Rsvd; no match, continue 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:21.349 nr_hugepages=1024 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:21.349 resv_hugepages=0 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:21.349 surplus_hugepages=0 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:21.349 anon_hugepages=0 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local
get=HugePages_Total 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78920112 kB' 'MemAvailable: 82218524 kB' 'Buffers: 12176 kB' 'Cached: 9438648 kB' 'SwapCached: 0 kB' 'Active: 6509880 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116296 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518508 kB' 'Mapped: 158776 kB' 'Shmem: 5600980 kB' 'KReclaimable: 204424 kB' 'Slab: 525948 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321524 kB' 'KernelStack: 16192 kB' 'PageTables: 7956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7541120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201096 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:21.349 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # scanning /proc/meminfo fields (MemTotal … CmaFree) against HugePages_Total; no match yet, continue 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31
-- # read -r var val _ 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv ))
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:05:21.350 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:05:21.351 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36595968 kB' 'MemUsed: 11520972 kB' 'SwapCached: 0 kB' 'Active: 5291200 kB' 'Inactive: 3372048 kB' 'Active(anon): 5133304 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8442632 kB' 'Mapped: 66688 kB' 'AnonPages: 223752 kB' 'Shmem: 4912688 kB' 'KernelStack: 9208 kB' 'PageTables: 4080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125696 kB' 'Slab: 329560 kB' 'SReclaimable: 125696 kB' 'SUnreclaim: 203864 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:21.351 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.351 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[... identical IFS=': ' / read -r var val _ / continue iterations for every remaining key from MemFree through HugePages_Free ...]
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:21.352 10:12:58
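The loop traced above is the get_meminfo pattern from SPDK's setup/common.sh: scan a meminfo-style file with `IFS=': '` and `read -r var val _` until the requested key matches, then echo its value (per-node values come from /sys/devices/system/node/node&lt;N&gt;/meminfo, whose lines carry a "Node &lt;n&gt; " prefix). A minimal re-implementation of that pattern for illustration; the `MEMINFO_FILE` override is an assumption added here for testability and is not part of the SPDK script:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern seen in the trace: read a
# meminfo-style file field by field until the requested key matches,
# then print its value. MEMINFO_FILE is an illustration-only hook;
# setup/common.sh itself switches to the per-node sysfs file when a
# node argument is given.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=${MEMINFO_FILE:-/proc/meminfo} var val rest
    # Per-node statistics live under sysfs on NUMA systems.
    if [[ -z ${MEMINFO_FILE:-} && -n $node &&
          -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while IFS=': ' read -r var val rest; do
        # Per-node files prefix every line with "Node <n> ";
        # re-split the remainder so the real key lands in $var.
        if [[ $var == Node ]]; then
            IFS=': ' read -r var val rest <<<"$rest"
        fi
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done <"$mem_f"
    return 1
}
```

With the values captured in this log, `get_meminfo HugePages_Total` would print 1024 and `get_meminfo HugePages_Surp 0` would print 0 for node 0.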
setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:21.352
00:05:21.352 real    0m6.467s
00:05:21.352 user    0m1.577s
00:05:21.352 sys     0m2.606s
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:21.352 10:12:58 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x
00:05:21.352 ************************************
00:05:21.352 END TEST default_setup
00:05:21.352 ************************************
00:05:21.352 10:12:58 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:05:21.352 10:12:58 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:05:21.352 10:12:58 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:21.352 10:12:58 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:21.352 10:12:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:21.352 ************************************
00:05:21.352 START TEST per_node_1G_alloc
00:05:21.352 ************************************
00:05:21.352 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc
00:05:21.352 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=,
00:05:21.352 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512
00:05:21.353 10:12:58
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:21.353 10:12:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:05:24.637 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:05:24.637 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:05:24.637 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:05:24.637 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:24.637 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:05:24.637 10:13:01
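The get_test_nr_hugepages trace above converts the 1048576 kB request into counts of 2048 kB default hugepages (512) and assigns the same count to each requested NUMA node, producing NRHUGE=512 with HUGENODE=0,1. A simplified sketch of that arithmetic; the function and variable names mirror the trace, but this is an illustration-only re-implementation, not the SPDK script itself:

```shell
#!/usr/bin/env bash
# Simplified sketch of the sizing logic from setup/hugepages.sh's
# get_test_nr_hugepages as seen in the trace: divide a kB request by
# the default hugepage size (2048 kB on x86-64) and assign the
# resulting count to every requested NUMA node.
default_hugepages=2048  # kB

get_test_nr_hugepages() {
    local size=$1
    shift
    local user_nodes=("$@") node
    (( size >= default_hugepages )) || return 1
    nr_hugepages=$(( size / default_hugepages ))
    for node in "${user_nodes[@]}"; do
        nodes_test[node]=$nr_hugepages
    done
}

declare -a nodes_test=()
get_test_nr_hugepages 1048576 0 1   # the call traced in this log
echo "NRHUGE=$nr_hugepages HUGENODE=${!nodes_test[*]}"
```

Run as above, this prints `NRHUGE=512 HUGENODE=0 1`, matching the per-node 512-page split the log goes on to verify.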
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78931888 kB' 'MemAvailable: 82230300 kB' 'Buffers: 12176 kB' 'Cached: 9438744 kB' 'SwapCached: 0 kB' 'Active: 6509752 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116168 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517864 kB' 'Mapped: 157840 kB' 'Shmem: 5601076 kB' 'KReclaimable: 204424 kB' 'Slab: 525452 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321028 kB' 'KernelStack: 16112 kB' 'PageTables: 7712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7533732 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB'
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... identical IFS=': ' / read -r var val _ / continue iterations for MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked and SwapTotal ...]
00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree ==
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.637 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 
10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile 
-t mem 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78932528 kB' 'MemAvailable: 82230940 kB' 'Buffers: 12176 kB' 'Cached: 9438744 kB' 'SwapCached: 0 kB' 'Active: 6509328 kB' 'Inactive: 3456260 kB' 'Active(anon): 6115744 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517920 kB' 'Mapped: 157676 kB' 'Shmem: 5601076 kB' 'KReclaimable: 204424 kB' 'Slab: 525432 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321008 kB' 'KernelStack: 16144 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7533752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.638 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 
10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 
10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.902 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:24.903 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78932276 kB' 'MemAvailable: 82230688 kB' 'Buffers: 12176 kB' 'Cached: 9438760 kB' 'SwapCached: 0 kB' 'Active: 6509372 kB' 'Inactive: 3456260 kB' 'Active(anon): 6115788 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517932 kB' 'Mapped: 157676 kB' 'Shmem: 5601092 kB' 'KReclaimable: 204424 kB' 'Slab: 525432 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321008 kB' 'KernelStack: 16128 kB' 'PageTables: 7752 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7533780 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.903 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 
10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 
10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:24.904 nr_hugepages=1024 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:24.904 resv_hugepages=0 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:24.904 surplus_hugepages=0 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:24.904 anon_hugepages=0 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@19 -- # local var val 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78930756 kB' 'MemAvailable: 82229168 kB' 'Buffers: 12176 kB' 'Cached: 9438780 kB' 'SwapCached: 0 kB' 'Active: 6512340 kB' 'Inactive: 3456260 kB' 'Active(anon): 6118756 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520872 kB' 'Mapped: 158180 kB' 'Shmem: 5601112 kB' 'KReclaimable: 204424 kB' 'Slab: 525400 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320976 kB' 'KernelStack: 16128 kB' 'PageTables: 7748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7537792 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 
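The trace above shows the `get_meminfo` pattern in action: `mapfile` the meminfo source into an array, strip any `Node N ` prefix (per-node files under `/sys/devices/system/node/` carry one), then scan each line with `IFS=': '` until the requested key matches. A minimal re-creation of that technique — a sketch, not the exact SPDK helper, with the function and file names here being illustrative:

```shell
#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern in the prefix strip

# Hedged sketch of the meminfo-scanning loop seen in the trace:
# read all lines, drop the optional "Node N " prefix, then split on
# ': ' and echo the value of the first line whose key matches.
get_meminfo() {
  local get=$1 mem_f=$2 line var val _
  local -a mem
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")   # no-op for plain /proc/meminfo lines
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<< "$line"
    [[ $var == "$get" ]] || continue
    echo "$val"
    return 0
  done
  return 1
}

# Usage against a fabricated per-node sample (values taken from the log):
sample=$(mktemp)
printf '%s\n' 'Node 0 MemTotal: 48116940 kB' \
              'Node 0 HugePages_Total: 512' > "$sample"
get_meminfo HugePages_Total "$sample"   # prints: 512
rm -f "$sample"
```

The per-field `continue` lines that dominate this log are exactly that scan loop skipping non-matching keys until it reaches `HugePages_Total`.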
10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.904 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 
10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:24.905 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:05:24.905 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37633304 kB' 'MemUsed: 10483636 kB' 'SwapCached: 0 kB' 'Active: 5295668 kB' 'Inactive: 3372048 kB' 'Active(anon): 5137772 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8442764 kB' 'Mapped: 67076 kB' 'AnonPages: 228164 kB' 'Shmem: 4912820 kB' 'KernelStack: 9144 kB' 'PageTables: 3896 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125696 kB' 'Slab: 329252 kB' 'SReclaimable: 125696 kB' 'SUnreclaim: 203556 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
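At this point the trace has confirmed 1024 global hugepages and begun checking node 0, which reports `HugePages_Total: 512` — the test expects the pool split evenly across the two NUMA nodes. A minimal sketch of that accounting, using hypothetical variable names modeled on the `nodes_sys`/`no_nodes` values visible in the log:

```shell
#!/usr/bin/env bash
# Hedged sketch of the per-node hugepage accounting seen in the trace:
# split a global pool evenly across NUMA nodes, then verify the sum
# matches the global HugePages_Total (1024 == 512 + 512 in this run).
nr_hugepages=1024
no_nodes=2
declare -A nodes_test
for ((node = 0; node < no_nodes; node++)); do
  nodes_test[$node]=$((nr_hugepages / no_nodes))   # 512 per node, as logged
done
total=0
for node in "${!nodes_test[@]}"; do
  (( total += nodes_test[node] ))
done
(( total == nr_hugepages )) && echo "per-node split OK (${total} pages)"
```

The surplus/reserved adjustments (`HugePages_Surp`, `resv`) seen in the trace would be folded into each node's expected count before this comparison.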
00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue [... identical IFS=': ' / read / continue iterations over the remaining node0 meminfo fields (Writeback through HugePages_Total) elided ...] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc --
setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41293672 kB' 'MemUsed: 2882860 kB' 'SwapCached: 0 kB' 'Active: 1219252 kB' 'Inactive: 84212 kB' 'Active(anon): 983564 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1008220 kB' 'Mapped: 91440 kB' 'AnonPages: 295336 kB' 'Shmem: 688320 kB' 'KernelStack: 6984 kB' 'PageTables: 3880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 78728 kB' 'Slab: 196140 kB' 'SReclaimable: 78728 kB' 'SUnreclaim: 117412 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.906 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.906 10:13:01 
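
The `get_meminfo` calls traced here read a per-node meminfo file (above, `/sys/devices/system/node/node1/meminfo`), strip the `Node <N> ` prefix from each line, and scan field by field for the requested key, echoing `0` when the field is absent (the `echo 0` / `return 0` records in the trace). A minimal standalone sketch of that parsing, assuming a meminfo-style input file; the function name mirrors `setup/common.sh`, but the `sed`-based prefix strip is an illustrative substitute for the script's `mapfile` + extglob expansion:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo parsing seen in this trace: read a meminfo-style
# file, drop any "Node <N> " prefix, and print the value of one field,
# falling back to 0 (the trace's "echo 0") when the field is missing.
get_meminfo_sketch() {
    local get=$1 mem_f=$2 var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    echo 0
}
```

Called as `get_meminfo_sketch HugePages_Surp /sys/devices/system/node/node1/meminfo`, this would report the per-node surplus count, matching the `get_meminfo HugePages_Surp 1` invocation in the trace.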
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue [... identical IFS=': ' / read / continue iterations over the remaining node1 meminfo fields (MemUsed through Unaccepted) elided ...] 00:05:24.907 10:13:01
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:24.907 node0=512 expecting 512 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- 
# for node in "${!nodes_test[@]}" 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:05:24.907 node1=512 expecting 512 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:05:24.907 00:05:24.907 real 0m3.659s 00:05:24.907 user 0m1.418s 00:05:24.907 sys 0m2.329s 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:24.907 10:13:01 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:24.907 ************************************ 00:05:24.907 END TEST per_node_1G_alloc 00:05:24.907 ************************************ 00:05:24.907 10:13:02 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:24.907 10:13:02 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:05:24.907 10:13:02 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:24.907 10:13:02 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:24.907 10:13:02 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:24.907 ************************************ 00:05:24.907 START TEST even_2G_alloc 00:05:24.907 ************************************ 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- 
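
The `sorted_t[nodes_test[node]]=1` records above implement the final check of `per_node_1G_alloc`: each node's resulting page count is used as an associative-array key, so the array collapses to a single key exactly when every node received the same count (here 512 on both nodes, hence `node0=512 expecting 512` and `node1=512 expecting 512`). A hedged sketch of that dedup trick; the helper name is illustrative, not from the SPDK scripts:

```shell
#!/usr/bin/env bash
# Sketch of the end-of-test check traced above: each node's final page count
# becomes an associative-array key (sorted_t in the trace), so the array has
# exactly one key iff every node ended up with the same count.
all_nodes_equal() {
    local -A seen=()
    local count
    for count in "$@"; do
        seen[$count]=1   # duplicate counts collapse onto one key
    done
    (( ${#seen[@]} == 1 ))
}
```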
setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:24.907 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:24.907 
10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:24.908 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:05:24.908 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:05:24.908 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:05:24.908 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:24.908 10:13:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:29.099 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:29.099 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:29.099 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:29.099 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:29.099 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 
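
The `even_2G_alloc` setup above requests `size=2097152` kB; with the 2048 kB default hugepage size that is `nr_hugepages=1024`, which `get_test_nr_hugepages_per_node` splits evenly across the 2 nodes, producing the two `nodes_test[_no_nodes - 1]=512` assignments in the trace. A small sketch of that arithmetic, assuming kB-denominated inputs; the helper name is illustrative:

```shell
#!/usr/bin/env bash
# Sketch of the sizing traced in even_2G_alloc: a 2097152 kB request with the
# 2048 kB default hugepage size yields nr_hugepages=1024, split evenly over
# the node count (2 here), giving 512 pages per node as seen above.
pages_per_node() {
    local size_kb=$1 hugepagesize_kb=$2 no_nodes=$3
    echo $(( size_kb / hugepagesize_kb / no_nodes ))
}
```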
00:05:29.099 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.100 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78933940 kB' 'MemAvailable: 82232352 kB' 'Buffers: 12176 kB' 'Cached: 9438896 kB' 'SwapCached: 0 kB' 'Active: 6509960 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116376 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518332 kB' 'Mapped: 157740 kB' 'Shmem: 5601228 kB' 'KReclaimable: 204424 kB' 'Slab: 525536 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321112 kB' 'KernelStack: 16144 kB' 'PageTables: 7812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7534276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB'
[repetitive setup/common.sh@31-32 scan trace condensed: each key from MemTotal through HardwareCorrupted is compared against AnonHugePages and skipped with "continue"]
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.101 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78934008 kB' 'MemAvailable: 82232420 kB' 'Buffers: 12176 kB' 'Cached: 9438900 kB' 'SwapCached: 0 kB' 'Active: 6509628 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116044 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518032 kB' 'Mapped: 157692 kB' 'Shmem: 5601232 kB' 'KReclaimable: 204424 kB' 'Slab: 525576 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321152 kB' 'KernelStack: 16144 kB' 'PageTables: 7820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7534292 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB'
[repetitive setup/common.sh@31-32 scan trace condensed: each key from MemTotal through VmallocTotal is compared against HugePages_Surp and skipped with "continue"]
00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 --
# [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.102 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78933048 kB' 'MemAvailable: 82231460 kB' 'Buffers: 12176 
kB' 'Cached: 9438916 kB' 'SwapCached: 0 kB' 'Active: 6509824 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116240 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518280 kB' 'Mapped: 157692 kB' 'Shmem: 5601248 kB' 'KReclaimable: 204424 kB' 'Slab: 525576 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321152 kB' 'KernelStack: 16144 kB' 'PageTables: 7864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7533944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200936 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.103 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 
10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.104 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:29.105 nr_hugepages=1024 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:29.105 resv_hugepages=0 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:29.105 surplus_hugepages=0 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:29.105 anon_hugepages=0 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:29.105 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78933744 kB' 'MemAvailable: 82232156 kB' 'Buffers: 12176 kB' 'Cached: 9438936 kB' 'SwapCached: 0 kB' 'Active: 6509484 kB' 'Inactive: 3456260 kB' 'Active(anon): 6115900 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517824 kB' 'Mapped: 157692 kB' 'Shmem: 5601268 kB' 'KReclaimable: 204424 kB' 'Slab: 525576 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321152 kB' 'KernelStack: 16080 kB' 'PageTables: 7576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7533972 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200920 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.105 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.106 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 
00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:29.107 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37638804 kB' 'MemUsed: 10478136 kB' 'SwapCached: 0 kB' 'Active: 5289884 kB' 'Inactive: 3372048 kB' 'Active(anon): 5131988 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8442872 kB' 'Mapped: 66388 kB' 'AnonPages: 222168 kB' 'Shmem: 4912928 kB' 'KernelStack: 9160 kB' 'PageTables: 3884 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125696 kB' 'Slab: 329204 kB' 'SReclaimable: 125696 kB' 'SUnreclaim: 203508 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': '
00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:29.107 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[... identical common.sh@31/@32 IFS=': ' / read -r var val _ / continue xtrace repeated for each remaining node0 meminfo field (Dirty through HugePages_Free); none match HugePages_Surp ...]
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41294436 kB' 'MemUsed: 2882096 kB' 'SwapCached: 0 kB' 'Active: 1219952 kB' 'Inactive: 84212 kB' 'Active(anon): 984264 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1008272 kB' 'Mapped: 91304 kB' 'AnonPages: 295960 kB' 'Shmem: 688372 kB' 'KernelStack: 6984 kB' 'PageTables: 3960 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 78728 kB' 'Slab: 196372 kB' 'SReclaimable: 78728 kB' 'SUnreclaim: 117644 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:29.108 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[... identical common.sh@31/@32 IFS=': ' / read -r var val _ / continue xtrace repeated for each remaining node1 meminfo field (MemFree through HugePages_Free); none match HugePages_Surp ...]
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:05:29.110 node0=512 expecting 512
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:05:29.110 node1=512 expecting 512
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:05:29.110
00:05:29.110 real 0m3.839s
00:05:29.110 user 0m1.548s
00:05:29.110 sys 0m2.392s
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc --
common/autotest_common.sh@1124 -- # xtrace_disable
00:05:29.110 10:13:05 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:29.110 ************************************
00:05:29.110 END TEST even_2G_alloc
00:05:29.110 ************************************
00:05:29.110 10:13:05 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:05:29.110 10:13:05 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:05:29.110 10:13:05 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:29.110 10:13:05 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:29.110 10:13:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:29.110 ************************************
00:05:29.110 START TEST odd_alloc
00:05:29.110 ************************************
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=()
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:29.110 10:13:05 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:05:32.401 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:05:32.401 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:05:32.401
0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver
00:05:32.401 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:32.401 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:32.401 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages
[... xtrace of verify_nr_hugepages setup: hugepages.sh@89-@94 local declarations (node, sorted_t, sorted_s, surp, resv, anon); hugepages.sh@96 THP check [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]; hugepages.sh@97 get_meminfo AnonHugePages; common.sh@17-@31 local get=AnonHugePages, local node=, mem_f=/proc/meminfo, [[ -e /sys/devices/system/node/node/meminfo ]], [[ -n '' ]], mapfile -t mem, mem=("${mem[@]#Node +([0-9]) }"), IFS=': ', read -r var val _ ...]
00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78919260 kB' 'MemAvailable: 82217672 kB' 'Buffers: 12176 kB' 'Cached: 9439048 kB' 'SwapCached: 0 kB' 'Active: 6513488 kB' 'Inactive: 3456260 kB' 'Active(anon): 6119904 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521800 kB' 'Mapped: 158300 kB' 'Shmem: 5601380 kB' 'KReclaimable: 204424 kB' 'Slab: 525952 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321528 kB' 'KernelStack: 16128 kB' 'PageTables: 7788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7539208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200984 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB'
00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[... identical common.sh@31/@32 IFS=': ' / read -r var val _ / continue xtrace repeated per meminfo field (MemFree, MemAvailable, Buffers, Cached, SwapCached, Active, Inactive, Active(anon), ...); log truncated mid-loop ...]
00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc --
setup/common.sh@31 -- # IFS=': ' 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.402 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.700 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.701 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.701 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78923596 kB' 'MemAvailable: 82222008 kB' 'Buffers: 12176 kB' 'Cached: 9439052 kB' 'SwapCached: 0 kB' 'Active: 6514172 kB' 'Inactive: 3456260 kB' 'Active(anon): 6120588 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522448 kB' 'Mapped: 158300 kB' 'Shmem: 5601384 kB' 'KReclaimable: 204424 kB' 'Slab: 525920 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321496 kB' 'KernelStack: 16128 kB' 'PageTables: 7780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7539208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.702 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.703 
10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78924100 kB' 'MemAvailable: 82222512 kB' 'Buffers: 12176 kB' 'Cached: 9439052 kB' 'SwapCached: 0 kB' 'Active: 6514172 kB' 'Inactive: 3456260 kB' 'Active(anon): 6120588 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522448 kB' 'Mapped: 158300 
kB' 'Shmem: 5601384 kB' 'KReclaimable: 204424 kB' 'Slab: 525920 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321496 kB' 'KernelStack: 16128 kB' 'PageTables: 7780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7539228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 
10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.703 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.704 10:13:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:05:32.705 nr_hugepages=1025 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:32.705 resv_hugepages=0 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:32.705 surplus_hugepages=0 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:32.705 anon_hugepages=0 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # 
get_meminfo HugePages_Total 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78926524 kB' 'MemAvailable: 82224936 kB' 'Buffers: 12176 kB' 'Cached: 9439088 kB' 'SwapCached: 0 kB' 'Active: 6510168 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116584 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518364 kB' 'Mapped: 158060 kB' 'Shmem: 5601420 kB' 'KReclaimable: 204424 kB' 'Slab: 525912 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 321488 kB' 'KernelStack: 16112 kB' 'PageTables: 7720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485740 kB' 'Committed_AS: 7534856 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.705 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 
10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.706 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 
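The loop traced above scans `/proc/meminfo` key by key, `continue`-ing past every field until the requested one (`HugePages_Total`) matches, then echoes its value (`1025`) and returns. A minimal sketch of that lookup, assuming the standard `Key: value [kB]` format (the helper below is illustrative, not the actual `setup/common.sh` implementation):

```shell
#!/usr/bin/env bash
# Sketch of a get_meminfo-style lookup: split each line on ': ' into
# key and value, skip non-matching keys, and print the value for the
# requested key. Illustrative only.
get_meminfo() {
  local get=$1 mem_f=${2:-/proc/meminfo} var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$get" ]] || continue   # same skip as the traced loop
    echo "$val"                        # e.g. 1025 for HugePages_Total
    return 0
  done < "$mem_f"
  return 1                             # key not present
}
```

The trailing `_` in the `read` absorbs the `kB` unit on sized fields, so counters like `HugePages_Total` and sizes like `MemTotal` come back as bare numbers either way.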
00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 
-- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37636828 kB' 'MemUsed: 10480112 kB' 'SwapCached: 0 kB' 'Active: 5291712 kB' 'Inactive: 3372048 kB' 'Active(anon): 5133816 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8442992 kB' 'Mapped: 66388 kB' 'AnonPages: 223468 kB' 'Shmem: 4913048 kB' 'KernelStack: 9160 kB' 'PageTables: 3864 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125696 kB' 'Slab: 329640 kB' 'SReclaimable: 125696 kB' 'SUnreclaim: 203944 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.707 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 
10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:32.708 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 41289900 kB' 'MemUsed: 2886632 kB' 'SwapCached: 0 kB' 'Active: 1218860 kB' 'Inactive: 84212 kB' 'Active(anon): 983172 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1008272 kB' 'Mapped: 91328 kB' 'AnonPages: 294848 kB' 
'Shmem: 688372 kB' 'KernelStack: 6968 kB' 'PageTables: 3880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 78728 kB' 'Slab: 196272 kB' 'SReclaimable: 78728 kB' 'SUnreclaim: 117544 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.708 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 
10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.709 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:05:32.711 node0=512 expecting 513 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:05:32.711 node1=513 expecting 512 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:32.711 00:05:32.711 real 0m3.811s 00:05:32.711 user 0m1.461s 00:05:32.711 sys 0m2.428s 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:32.711 10:13:09 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:32.711 ************************************ 00:05:32.711 END TEST odd_alloc 00:05:32.711 ************************************ 00:05:32.711 10:13:09 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:32.711 10:13:09 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:05:32.711 10:13:09 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:32.711 10:13:09 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:32.711 10:13:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:32.711 ************************************ 00:05:32.711 START TEST custom_alloc 00:05:32.711 ************************************ 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes 
- 1]=256 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:32.711 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:32.712 10:13:09 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:36.911 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:36.911 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:36.911 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:36.911 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:36.911 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:36.911 
0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:36.911 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:36.911 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:36.911 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:36.911 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:36.911 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:36.911 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:36.911 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:05:36.911 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:05:36.911 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node
00:05:36.911 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:05:36.911 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:05:36.911 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77865596 kB' 'MemAvailable: 81164008 kB' 'Buffers: 12176 kB' 'Cached: 9439200 kB' 'SwapCached: 0 kB' 'Active: 6509428 kB' 'Inactive: 3456260 kB' 'Active(anon): 6115844 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517480 kB' 'Mapped: 157792 kB' 'Shmem: 5601532 kB' 'KReclaimable: 204424 kB' 'Slab: 524596 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320172 kB' 'KernelStack: 16128 kB' 'PageTables: 7796 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7535476 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200968 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB'
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:36.912 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
[... the IFS=': ' / read -r var val _ / [[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue cycle repeats identically for each subsequent meminfo field until AnonHugePages matches ...]
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.913
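The field-by-field scan traced above is SPDK's setup/common.sh get_meminfo walking /proc/meminfo until the requested field matches. A minimal standalone sketch of the same lookup technique follows; it is simplified from the trace, not the exact SPDK implementation:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern from the trace: load /proc/meminfo,
# strip any "Node <n> " prefix (present in per-node sysfs meminfo files),
# then scan line by line for the requested field. Simplified; not the
# exact setup/common.sh implementation.
shopt -s extglob  # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # A per-node query reads that node's own meminfo file instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")  # drop "Node 0 " style prefixes
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"  # value only: a kB count or a page count
        return 0
    done
    return 1
}

get_meminfo MemTotal
```

Fields without a unit, such as HugePages_Surp, come back as a bare page count, which is why the trace's verify step can compare them numerically.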
10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77865724 kB' 'MemAvailable: 81164136 kB' 'Buffers: 12176 kB' 'Cached: 9439204 kB' 'SwapCached: 0 kB' 'Active: 6509968 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116384 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518020 kB' 'Mapped: 157724 kB' 'Shmem: 5601536 kB' 'KReclaimable: 204424 kB' 'Slab: 524596 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320172 kB' 'KernelStack: 16128 kB' 'PageTables: 7792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7535496 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB'
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:36.913 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
[... the IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue cycle repeats identically for each subsequent meminfo field ...]
00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc
-- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf 
'%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77866180 kB' 'MemAvailable: 81164592 kB' 'Buffers: 12176 kB' 'Cached: 9439220 kB' 'SwapCached: 0 kB' 'Active: 6509640 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116056 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517624 kB' 'Mapped: 157724 kB' 'Shmem: 5601552 kB' 'KReclaimable: 204424 kB' 'Slab: 524652 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320228 kB' 'KernelStack: 16112 kB' 'PageTables: 7756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7535516 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.915 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.916 10:13:13 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:36.917 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:05:36.917 nr_hugepages=1536 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:36.917 resv_hugepages=0 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:36.917 surplus_hugepages=0 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:36.917 anon_hugepages=0 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.917 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.917 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 77865424 kB' 'MemAvailable: 81163836 kB' 'Buffers: 12176 kB' 'Cached: 9439244 kB' 'SwapCached: 0 kB' 'Active: 6509664 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116080 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517624 kB' 'Mapped: 157724 kB' 'Shmem: 5601576 kB' 'KReclaimable: 204424 kB' 'Slab: 524652 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320228 kB' 'KernelStack: 16112 kB' 'PageTables: 7756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962476 kB' 'Committed_AS: 7535540 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200952 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.918 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@112 -- # get_nodes 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 
-- # mapfile -t mem 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.919 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 37635808 kB' 'MemUsed: 10481132 kB' 'SwapCached: 0 kB' 'Active: 5290348 kB' 'Inactive: 3372048 kB' 'Active(anon): 5132452 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8443128 kB' 'Mapped: 66388 kB' 'AnonPages: 222388 kB' 'Shmem: 4913184 kB' 'KernelStack: 9128 kB' 'PageTables: 3828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125696 kB' 'Slab: 328668 kB' 'SReclaimable: 125696 kB' 'SUnreclaim: 202972 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 
10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- 
# continue 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:36.920 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ [... identical IFS=': ' / read / "[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / continue xtrace iterations over the remaining node0 meminfo fields (Shmem through HugePages_Free) elided ...] 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:36.921 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176532 kB' 'MemFree: 40230120 kB' 'MemUsed: 3946412 kB' 'SwapCached: 0 kB' 'Active: 1219236 kB' 'Inactive: 84212 kB'
'Active(anon): 983548 kB' 'Inactive(anon): 0 kB' 'Active(file): 235688 kB' 'Inactive(file): 84212 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1008292 kB' 'Mapped: 91336 kB' 'AnonPages: 295156 kB' 'Shmem: 688392 kB' 'KernelStack: 6984 kB' 'PageTables: 3936 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 78728 kB' 'Slab: 195984 kB' 'SReclaimable: 78728 kB' 'SUnreclaim: 117256 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' [... identical IFS=': ' / read / "[[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / continue xtrace iterations over the node1 meminfo fields (MemTotal through HugePages_Free) elided ...] 00:05:36.922 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:36.923 node0=512 expecting 512 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:36.923 node1=1024 expecting 1024 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:36.923 00:05:36.923 real 0m3.857s 00:05:36.923 user 0m1.543s 00:05:36.923 sys 0m2.412s 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:36.923 10:13:13 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:36.923 ************************************ 00:05:36.923 END TEST custom_alloc 00:05:36.923 ************************************ 00:05:36.923 10:13:13 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:36.923 10:13:13 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:36.923 10:13:13 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:36.923 10:13:13 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # 
xtrace_disable 00:05:36.923 10:13:13 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:36.923 ************************************ 00:05:36.923 START TEST no_shrink_alloc 00:05:36.923 ************************************ 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:36.923 10:13:13 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:36.923 10:13:13 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:40.212 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:40.212 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:40.212 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:40.212 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.2 
(8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:40.212 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:40.475 10:13:17 
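The `get_test_nr_hugepages_per_node` trace above (setup/hugepages.sh@62 through @73) shows user node "0" receiving the full 1024-page request. A hedged reconstruction of that assignment logic follows; function and variable names mirror the xtrace, but the even-split fallback branch (taken when no node list is passed) is inferred rather than shown in this chunk of the log:

```shell
#!/usr/bin/env bash
# Sketch of the per-node hugepage assignment seen in the xtrace.
# Assumptions: $nr_hugepages and $no_nodes are set by the caller, and
# nodes_test is the global result array, as in setup/hugepages.sh.
get_test_nr_hugepages_per_node() {
    local user_nodes=("$@")        # explicit node ids, e.g. "0"
    local _nr_hugepages=$nr_hugepages
    local _no_nodes=$no_nodes      # total NUMA nodes on the machine
    nodes_test=()

    if ((${#user_nodes[@]} > 0)); then
        # Explicit node list: each named node gets the full request
        # (matches hugepages.sh@70-@71 in the trace).
        for _no_nodes in "${user_nodes[@]}"; do
            nodes_test[_no_nodes]=$_nr_hugepages
        done
    else
        # No list given: split the request evenly across all nodes
        # (inferred branch, not visible in this log chunk).
        for ((node = 0; node < _no_nodes; node++)); do
            nodes_test[node]=$((_nr_hugepages / _no_nodes))
        done
    fi
    return 0
}
```

With `nr_hugepages=1024` and the node list `0`, this reproduces the `nodes_test[_no_nodes]=1024` assignment the trace records.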
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78865668 kB' 'MemAvailable: 82164080 kB' 'Buffers: 12176 kB' 'Cached: 9439356 kB' 'SwapCached: 0 kB' 'Active: 6510696 kB' 'Inactive: 3456260 kB' 'Active(anon): 6117112 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518812 kB' 'Mapped: 157904 kB' 'Shmem: 5601688 kB' 'KReclaimable: 204424 kB' 'Slab: 524384 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 319960 kB' 'KernelStack: 16112 kB' 'PageTables: 7760 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536336 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200872 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.475 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.475 10:13:17 
[... identical IFS=': ' / read / "[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / continue xtrace iterations scanning the meminfo fields (MemFree onward) elided ...] 00:05:40.476 10:13:17
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 
10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.476 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:40.477 
10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78868596 kB' 'MemAvailable: 82167008 kB' 'Buffers: 12176 kB' 'Cached: 9439360 kB' 'SwapCached: 0 kB' 'Active: 6510064 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116480 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518124 kB' 'Mapped: 157736 kB' 'Shmem: 5601692 kB' 'KReclaimable: 204424 kB' 'Slab: 524428 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320004 kB' 'KernelStack: 16112 kB' 'PageTables: 7748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.477 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:40.478 [scan of remaining non-matching /proc/meminfo keys elided] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:40.478 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:40.479 10:13:17
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78869464 kB' 'MemAvailable: 82167876 kB' 'Buffers: 12176 kB' 'Cached: 9439376 kB' 'SwapCached: 0 kB' 'Active: 6510080 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116496 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518124 kB' 'Mapped: 157736 kB' 'Shmem: 5601708 kB' 'KReclaimable: 204424 kB' 'Slab: 524428 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320004 kB' 'KernelStack: 16112 kB' 'PageTables: 7748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:40.479 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.479 [scan of remaining non-matching /proc/meminfo keys elided] 00:05:40.480 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:40.480 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:40.480 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:40.480 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:40.480 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:40.480 nr_hugepages=1024 00:05:40.480 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:40.480 resv_hugepages=0 00:05:40.480 
10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:40.481 surplus_hugepages=0 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:40.481 anon_hugepages=0 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78869396 kB' 'MemAvailable: 82167808 kB' 'Buffers: 12176 kB' 'Cached: 9439400 kB' 'SwapCached: 0 kB' 'Active: 6510092 kB' 'Inactive: 3456260 kB' 'Active(anon): 6116508 kB' 'Inactive(anon): 0 
kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518124 kB' 'Mapped: 157736 kB' 'Shmem: 5601732 kB' 'KReclaimable: 204424 kB' 'Slab: 524428 kB' 'SReclaimable: 204424 kB' 'SUnreclaim: 320004 kB' 'KernelStack: 16112 kB' 'PageTables: 7748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7536400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200824 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 
10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.481 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:40.482 
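The trace above walks /proc/meminfo one record at a time with `IFS=': '` and a `read`/`continue` loop until the requested key (`HugePages_Total`) matches, then echoes its value. A minimal standalone sketch of that technique (not the SPDK source itself; the `FILE` argument is an addition for testability, the real helper reads `/proc/meminfo` directly):

```shell
# get_meminfo KEY [FILE] - print the value for KEY from a meminfo-style
# file, mirroring the traced loop: split each line on ': ', skip keys
# that do not match, print the matching value and stop.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # var is the key (e.g. HugePages_Total), val the number,
        # _ swallows a trailing unit such as "kB" when present
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1
}

get_meminfo HugePages_Total
```

On the host traced above this prints `1024`, which is what the subsequent `(( 1024 == nr_hugepages + surp + resv ))` check consumes.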
10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # 
mem_f=/sys/devices/system/node/node0/meminfo 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36568696 kB' 'MemUsed: 11548244 kB' 'SwapCached: 0 kB' 'Active: 5290508 kB' 'Inactive: 3372048 kB' 'Active(anon): 5132612 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8443300 kB' 'Mapped: 66388 kB' 'AnonPages: 222476 kB' 'Shmem: 4913356 kB' 'KernelStack: 9128 kB' 'PageTables: 3860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125696 kB' 'Slab: 328776 kB' 'SReclaimable: 125696 kB' 'SUnreclaim: 203080 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.482 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.483 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [... identical read/compare/continue trace elided for the remaining /proc/meminfo keys through HugePages_Free ...] 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:40.768 node0=1024 expecting 1024 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:05:40.768 10:13:17 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:40.768 10:13:17 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:44.062 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:05:44.062 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:05:44.062 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:5e:00.0 (8086 0b60): Already using the vfio-pci driver 00:05:44.062 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:44.062 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:44.062 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc 
-- setup/hugepages.sh@90 -- # local sorted_t 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78871764 kB' 'MemAvailable: 
82170160 kB' 'Buffers: 12176 kB' 'Cached: 9439480 kB' 'SwapCached: 0 kB' 'Active: 6512748 kB' 'Inactive: 3456260 kB' 'Active(anon): 6119164 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520684 kB' 'Mapped: 157840 kB' 'Shmem: 5601812 kB' 'KReclaimable: 204392 kB' 'Slab: 524964 kB' 'SReclaimable: 204392 kB' 'SUnreclaim: 320572 kB' 'KernelStack: 16240 kB' 'PageTables: 7984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7537208 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201096 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:44.062 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [... identical read/compare/continue trace elided for each /proc/meminfo key from MemFree through HardwareCorrupted ...] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78873376 kB' 'MemAvailable: 82171772 kB' 'Buffers: 12176 kB' 'Cached: 9439484 kB' 'SwapCached: 0 kB' 'Active: 6511796 kB' 'Inactive: 3456260 kB' 'Active(anon): 6118212 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519804 kB' 'Mapped: 157740 kB' 'Shmem: 5601816 kB' 'KReclaimable: 204392 kB' 'Slab: 524964 kB' 'SReclaimable: 204392 kB' 'SUnreclaim: 320572 kB' 'KernelStack: 16144 kB' 'PageTables: 7824 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7537224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201016 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue [... identical read/compare/continue trace continues for the remaining /proc/meminfo keys; log truncated here ...]
-- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.064 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:44.065 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:44.066 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78874208 kB' 'MemAvailable: 82172604 kB' 'Buffers: 12176 kB' 'Cached: 9439504 kB' 'SwapCached: 0 kB' 'Active: 6511816 kB' 'Inactive: 3456260 kB' 'Active(anon): 6118232 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519820 kB' 'Mapped: 157740 kB' 'Shmem: 5601836 kB' 'KReclaimable: 204392 kB' 'Slab: 525048 kB' 'SReclaimable: 204392 kB' 'SUnreclaim: 320656 kB' 'KernelStack: 16112 kB' 'PageTables: 7764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7537248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 
10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d 
]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.066 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 
10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:44.067 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:44.067 nr_hugepages=1024 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:44.068 resv_hugepages=0 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:44.068 surplus_hugepages=0 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:44.068 anon_hugepages=0 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 
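The trace above shows `setup/common.sh`'s `get_meminfo` helper walking a meminfo-style dump one `Key: value` pair at a time (the repeated `[[ ... ]]` / `continue` pairs) until the requested field matches, then echoing its value. A minimal standalone sketch of that pattern is below; it is an illustrative reimplementation under assumed names, not the exact `setup/common.sh` code, and the sample-file path is hypothetical:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern visible in the trace:
# split each "Key: value" line on ':' and ' ', skip non-matching keys,
# and print the value of the first matching one.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys produce the "continue" lines seen in the log.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Usage against a small sample file so it runs without /proc:
printf '%s\n' 'MemTotal: 92293472 kB' 'HugePages_Total: 1024' \
    > /tmp/meminfo.sample
get_meminfo HugePages_Total /tmp/meminfo.sample   # prints 1024
```

The `IFS=': '` split is what lets a single `read -r var val _` handle both `HugePages_Total: 1024` (no unit) and `MemTotal: 92293472 kB` (unit captured by the trailing `_`).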
00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293472 kB' 'MemFree: 78874208 kB' 'MemAvailable: 82172604 kB' 'Buffers: 12176 kB' 'Cached: 9439524 kB' 'SwapCached: 0 kB' 'Active: 6511844 kB' 'Inactive: 3456260 kB' 'Active(anon): 6118260 kB' 'Inactive(anon): 0 kB' 'Active(file): 393584 kB' 'Inactive(file): 3456260 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519820 kB' 'Mapped: 157740 kB' 'Shmem: 5601856 kB' 'KReclaimable: 204392 kB' 'Slab: 525048 kB' 'SReclaimable: 204392 kB' 'SUnreclaim: 320656 kB' 'KernelStack: 16112 kB' 'PageTables: 7764 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486764 kB' 'Committed_AS: 7537268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201000 kB' 'VmallocChunk: 0 kB' 'Percpu: 55040 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 789924 kB' 'DirectMap2M: 13565952 kB' 'DirectMap1G: 87031808 kB' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.068 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 
10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:44.069 10:13:21 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _
[xtrace elided: the IFS=': ' read loop continues past the remaining non-matching /proc/meminfo fields (ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted) until it reaches HugePages_Total]
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:44.330 10:13:21 setup.sh.hugepages.no_shrink_alloc --
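The scan traced above is setup/common.sh's `get_meminfo` helper: open `/proc/meminfo` (or a per-node sysfs copy), split each line on `': '`, and stop at the requested field. A minimal standalone sketch of that loop — the `sed`-based "Node N" prefix stripping here is an illustrative stand-in for the `mem=("${mem[@]#Node +([0-9]) }")` step, not SPDK's exact code:

```shell
#!/usr/bin/env bash
# Sketch of a get_meminfo-style lookup: pick one field out of a
# meminfo-style file, splitting on ': ' the way the traced
# "IFS=': ' read -r var val _" loop does. Illustrative only.
get_meminfo() {
  local get=$1 node=${2:-} mem_f=/proc/meminfo var val _
  # Per-node counters live under /sys/devices/system/node/nodeN/meminfo
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  # Per-node lines carry a "Node N " prefix; strip it before matching
  # (the trace does this with a pattern expansion over the mapfile array).
  while IFS=': ' read -r var val _; do
    if [[ $var == "$get" ]]; then
      echo "$val"   # the unit suffix ("kB") lands in $_ and is dropped
      return 0
    fi
  done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
  return 1
}

# On a Linux host this prints the MemTotal value in kB:
if [[ -r /proc/meminfo ]]; then get_meminfo MemTotal; fi
```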
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116940 kB' 'MemFree: 36562980 kB' 'MemUsed: 11553960 kB' 'SwapCached: 0 kB' 'Active: 5291032 kB' 'Inactive: 3372048 kB' 'Active(anon): 5133136 kB' 'Inactive(anon): 0 kB' 'Active(file): 157896 kB' 'Inactive(file): 3372048 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8443380 kB' 'Mapped: 66388 kB' 'AnonPages: 222900 kB' 'Shmem: 4913436 kB' 'KernelStack: 9144 kB' 'PageTables: 3876 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 125696 kB' 'Slab: 329136 kB' 'SReclaimable: 125696 kB' 'SUnreclaim: 203440 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace elided: the same IFS=': ' read loop scans every node0 meminfo field (MemTotal through HugePages_Free) without a match until it reaches HugePages_Surp]
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:05:44.331 node0=1024 expecting 1024
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:44.331
00:05:44.331 real 0m7.494s
00:05:44.331 user 0m2.941s
00:05:44.331 sys 0m4.762s
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:44.331 10:13:21 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:44.331 ************************************
00:05:44.331 END TEST no_shrink_alloc
00:05:44.331 ************************************
00:05:44.331 10:13:21 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:05:44.331 10:13:21 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp
00:05:44.331 10:13:21 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp
[xtrace elided: clear_hp loops over both nodes' /sys/devices/system/node/node$node/hugepages/hugepages-* entries and echoes 0 into each]
00:05:44.332 10:13:21 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:05:44.332 10:13:21 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:05:44.332
00:05:44.332 real 0m29.747s
00:05:44.332 user 0m10.731s
00:05:44.332 sys 0m17.351s
00:05:44.332 10:13:21 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:44.332 10:13:21 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:44.332 ************************************
00:05:44.332 END TEST hugepages
00:05:44.332 ************************************
00:05:44.332 10:13:21 setup.sh -- common/autotest_common.sh@1142 -- # return 0
00:05:44.332 10:13:21 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh
00:05:44.332 10:13:21 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:44.332 10:13:21 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:44.332 10:13:21 setup.sh -- common/autotest_common.sh@10 -- # set +x
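The hugepages.sh@115-@128 trace is a small per-node reconciliation: add any reserved/surplus pages to each node's expected count, then print `nodeN=<count> expecting <count>`. A toy reproduction of that bookkeeping, with the variable names borrowed from the trace and the values hard-coded from this run (this is not SPDK's code, just the same arithmetic):

```shell
#!/usr/bin/env bash
# Toy reproduction of the per-node bookkeeping seen in the trace:
# nodes_test holds the expected layout, resv/surp adjust it, and the
# result is printed per node. Values are hard-coded from this log.
declare -A nodes_test=( [0]=1024 [1]=0 )  # expected pages per NUMA node
resv=0   # reserved pages (HugePages_Rsvd)
surp=0   # surplus pages (HugePages_Surp, read back as 0 above)
for node in "${!nodes_test[@]}"; do
  (( nodes_test[node] += resv + surp ))
done
# Only node0 holds pages in this run, so only it is reported
echo "node0=${nodes_test[0]} expecting 1024"
```

With both adjustments zero, node0 keeps its original 1024 pages, which is exactly the `node0=1024 expecting 1024` line in the log.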
00:05:44.332 ************************************ 00:05:44.332 START TEST driver 00:05:44.332 ************************************ 00:05:44.332 10:13:21 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:44.332 * Looking for test storage... 00:05:44.332 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:44.332 10:13:21 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:44.332 10:13:21 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:44.332 10:13:21 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:49.603 10:13:26 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:49.603 10:13:26 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:49.603 10:13:26 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:49.603 10:13:26 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:49.603 ************************************ 00:05:49.603 START TEST guess_driver 00:05:49.603 ************************************ 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e 
/sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 216 > 0 )) 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:49.603 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:49.603 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:49.603 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:49.603 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:49.603 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:49.603 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:49.603 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:49.603 
10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:05:49.603 Looking for driver=vfio-pci
00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config
00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]]
00:05:49.603 10:13:26 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
[xtrace elided: the read/marker loop at setup/driver.sh@57-61 walks the config output starting at 00:05:53.792; two lines without a "->" marker hit @58 continue, then every subsequent "->" line matches [[ vfio-pci == vfio-pci ]] through 00:05:55.950]
00:05:55.950 10:13:32 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 ))
00:05:55.950 10:13:32 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset
00:05:55.950 10:13:32 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]]
00:05:55.950 10:13:32 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:06:01.229
00:06:01.229 real 0m11.196s
00:06:01.229 user 0m2.769s
00:06:01.229 sys 0m5.254s
00:06:01.229 10:13:37 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:01.229 10:13:37 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x
00:06:01.229 ************************************
00:06:01.229 END TEST guess_driver
00:06:01.229 ************************************
00:06:01.229 10:13:37 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0
00:06:01.229
00:06:01.229 real 0m16.304s
00:06:01.229 user 0m4.218s
00:06:01.229 sys 0m8.118s
00:06:01.229 10:13:37 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:01.229 10:13:37 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x
00:06:01.229 ************************************
00:06:01.229 END TEST driver
00:06:01.229 ************************************
00:06:01.229 10:13:37 setup.sh -- common/autotest_common.sh@1142 -- # return 0
00:06:01.229 10:13:37 setup.sh --
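The guess_driver trace boils down to: count the entries under /sys/kernel/iommu_groups, ask `modprobe --show-depends vfio_pci` whether the module chain resolves to real `.ko` files, and pick vfio-pci when both hold. A hedged sketch of that decision — `pick_driver` is a stand-in name, and the `uio_pci_generic` fallback and nullglob handling are assumptions, not SPDK's exact code:

```shell
#!/usr/bin/env bash
# Sketch of the driver selection traced above: vfio-pci is usable when
# IOMMU groups exist and modprobe can resolve the module dependency
# chain; otherwise fall back to a uio driver. Names are illustrative.
pick_driver() {
  shopt -s nullglob   # an empty dir should count as 0 groups, not 1
  local groups=(/sys/kernel/iommu_groups/*)
  if (( ${#groups[@]} > 0 )) &&
     modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
    echo vfio-pci        # mirrors the "== *\.\k\o*" test in the trace
  else
    echo uio_pci_generic # assumed fallback when vfio is unavailable
  fi
}

echo "Looking for driver=$(pick_driver)"
```

On the log's host, 216 IOMMU groups and a resolvable vfio_pci chain selected vfio-pci; on a machine without an IOMMU the sketch takes the fallback branch instead.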
setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:01.229 10:13:37 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:01.229 10:13:37 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:01.229 10:13:37 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:01.229 ************************************ 00:06:01.229 START TEST devices 00:06:01.229 ************************************ 00:06:01.229 10:13:37 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:01.229 * Looking for test storage... 00:06:01.229 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:01.229 10:13:37 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:06:01.229 10:13:37 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:06:01.229 10:13:37 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:01.229 10:13:37 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:04.557 10:13:41 
setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:06:04.557 10:13:41 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:06:04.557 No valid GPT data, bailing 00:06:04.557 10:13:41 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:06:04.557 10:13:41 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:06:04.557 10:13:41 setup.sh.devices -- setup/common.sh@80 -- # echo 7681501126656 00:06:04.557 10:13:41 setup.sh.devices -- 
setup/devices.sh@204 -- # (( 7681501126656 >= min_disk_size )) 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:06:04.557 10:13:41 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:04.557 10:13:41 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:04.557 ************************************ 00:06:04.557 START TEST nvme_mount 00:06:04.557 ************************************ 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:04.557 10:13:41 
setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:04.557 10:13:41 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:06:05.493 Creating new GPT entries in memory. 00:06:05.493 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:05.493 other utilities. 00:06:05.493 10:13:42 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:05.493 10:13:42 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:05.493 10:13:42 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:05.493 10:13:42 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:05.493 10:13:42 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:06.870 Creating new GPT entries in memory. 00:06:06.870 The operation has completed successfully. 
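The `--new=1:2048:2099199` bounds in the sgdisk call above come from the sector arithmetic traced at setup/common.sh@51 and @57-60: the 1 GiB partition size is divided by 512 to get sectors, the first partition starts at sector 2048, and each partition ends at start + size - 1. A sketch of that calculation (no disk is touched; the sgdisk invocation is only echoed):

```shell
#!/usr/bin/env bash
# Reproduce the partition bounds from the setup/common.sh trace above.
disk=nvme0n1
part_no=1
size=1073741824        # 1 GiB per partition, in bytes
(( size /= 512 ))      # convert to 512-byte sectors -> 2097152

part_start=0 part_end=0
for (( part = 1; part <= part_no; part++ )); do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    # The real script runs this under flock; here it is only printed:
    echo "sgdisk /dev/$disk --new=$part:$part_start:$part_end"
done
```

With these inputs the single iteration prints `sgdisk /dev/nvme0n1 --new=1:2048:2099199`, matching the bounds in the trace.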
00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 417652 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:06.870 
10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:06.870 10:13:43 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.152 10:13:47 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.152 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:10.153 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:10.411 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:10.411 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:10.411 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:10.670 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:10.670 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54 00:06:10.670 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:10.670 
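The cleanup_nvme sequence traced above (setup/devices.sh@20-28) unmounts the test mount point if it is mounted, then wipes filesystem and GPT signatures from the partition and the whole disk, which produces the `wipefs` byte-erase messages in the log. A sketch of that flow, with an illustrative mount path and the destructive commands echoed rather than executed so it is safe to run anywhere:

```shell
#!/usr/bin/env bash
# Sketch of the cleanup_nvme flow from the setup/devices.sh trace above.
run() { echo "+ $*"; }   # stand-in: print the command instead of executing it

nvme_mount=/tmp/spdk/test/setup/nvme_mount   # illustrative mount point

cleanup_nvme() {
    # Unmount only if the path is actually a mount point
    if mountpoint -q "$nvme_mount"; then
        run umount "$nvme_mount"
    fi
    # Wipe signatures from the partition first, then the whole disk
    [[ -b /dev/nvme0n1p1 ]] && run wipefs --all /dev/nvme0n1p1
    [[ -b /dev/nvme0n1 ]] && run wipefs --all /dev/nvme0n1
    return 0
}

cleanup_nvme
```

Wiping the partition before the disk mirrors the order in the trace: the ext4 magic on nvme0n1p1 is erased first, then the GPT headers and protective MBR on nvme0n1.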
/dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:10.670 10:13:47 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount 
-- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:13.957 10:13:50 
setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.957 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:13.958 10:13:50 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:13.958 10:13:50 
setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
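The long run of `[[ <pci> == \0\0\0\0\:\5\e\:\0\0\.\0 ]]` checks around here is the verify() scan from setup/devices.sh@59-66: with `PCI_ALLOWED` pinned to the device under test, it reads each line of `setup.sh config` output and sets `found=1` only when the test device's line reports the expected active mount. A minimal sketch of that scan, with an illustrative `config_output` sample standing in for the real `setup.sh config` output:

```shell
#!/usr/bin/env bash
# Sketch of the verify() scan from the setup/devices.sh trace above.
dev=0000:5e:00.0          # device under test (PCI_ALLOWED in the real script)
mounts=data@nvme0n1       # mount signature verify() expects to see
found=0
# Illustrative stand-in for "setup.sh config" output lines ("<pci> ... <status>"):
config_output='0000:5e:00.0 nvme nvme0 Active devices: data@nvme0n1, so not binding PCI dev
0000:00:04.7 ioat - -
0000:80:04.0 ioat - -'

while read -r pci _ _ status; do
    # Only the device under test must report the expected active mounts;
    # every other PCI function is simply skipped.
    if [[ $pci == "$dev" && $status == *"Active devices: "*"$mounts"* ]]; then
        found=1
    fi
done <<< "$config_output"

(( found == 1 )) && echo "device $dev is active with $mounts"
```

The `(( found == 1 ))` check at the end corresponds to setup/devices.sh@66 in the trace; the test fails if the device never shows up as actively mounted.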
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:17.241 10:13:53 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]]
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:06:17.241 10:13:54 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:06:17.241 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:06:17.241 
00:06:17.241 real	0m12.438s
00:06:17.242 user	0m3.536s
00:06:17.242 sys	0m6.654s
00:06:17.242 10:13:54 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:17.242 10:13:54 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x
00:06:17.242 ************************************
00:06:17.242 END TEST nvme_mount
00:06:17.242 ************************************
00:06:17.242 10:13:54 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0
00:06:17.242 10:13:54 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:06:17.242 10:13:54 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:17.242 10:13:54 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:17.242 10:13:54 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:06:17.242 ************************************
00:06:17.242 START TEST dm_mount
00:06:17.242 ************************************
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=()
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 ))
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 ))
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:06:17.242 10:13:54 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:06:18.178 Creating new GPT entries in memory.
00:06:18.178 GPT data structures destroyed! You may now partition the disk using fdisk or
00:06:18.178 other utilities.
00:06:18.178 10:13:55 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 ))
00:06:18.178 10:13:55 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:06:18.178 10:13:55 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:06:18.178 10:13:55 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:06:18.178 10:13:55 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:06:19.116 Creating new GPT entries in memory.
00:06:19.116 The operation has completed successfully.
00:06:19.116 10:13:56 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:06:19.116 10:13:56 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:06:19.116 10:13:56 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:06:19.116 10:13:56 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:06:19.116 10:13:56 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351
00:06:20.495 The operation has completed successfully.
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 421747
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5}
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]]
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]]
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size=
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]]
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # :
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:06:20.495 10:13:57 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:23.781 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:24.038 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:06:24.039 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]]
00:06:24.039 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:24.039 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]]
00:06:24.039 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:06:24.039 10:14:00 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' ''
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]]
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:06:24.039 10:14:01 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d7:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:85:05.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:06:27.323 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]]
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]]
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1
00:06:27.583 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2
00:06:27.583 
00:06:27.583 real	0m10.497s
00:06:27.583 user	0m2.714s
00:06:27.583 sys	0m4.905s
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:27.583 10:14:04 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x
00:06:27.583 ************************************
00:06:27.583 END TEST dm_mount
00:06:27.583 ************************************
00:06:27.583 10:14:04 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0
00:06:27.583 10:14:04 setup.sh.devices -- setup/devices.sh@1 -- # cleanup
00:06:27.583 10:14:04 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme
00:06:27.583 10:14:04 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:06:27.583 10:14:04 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:06:27.583 10:14:04 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:06:27.583 10:14:04 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:06:27.583 10:14:04 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:06:27.843 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:06:27.843 /dev/nvme0n1: 8 bytes were erased at offset 0x6fc7d255e00 (gpt): 45 46 49 20 50 41 52 54
00:06:27.843 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:06:27.843 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:06:27.843 10:14:04 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm
00:06:27.843 10:14:05 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:06:27.843 10:14:05 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]]
00:06:27.843 10:14:05 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:06:27.843 10:14:05 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:06:27.843 10:14:05 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]]
00:06:27.843 10:14:05 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1
00:06:27.843 
00:06:27.843 real	0m27.218s
00:06:27.843 user	0m7.627s
00:06:27.843 sys	0m14.343s
00:06:27.843 10:14:05 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:27.843 10:14:05 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:06:27.843 ************************************
00:06:27.843 END TEST devices
00:06:27.843 ************************************
00:06:28.102 10:14:05 setup.sh -- common/autotest_common.sh@1142 -- # return 0
00:06:28.102 
00:06:28.102 real	1m39.836s
00:06:28.102 user	0m30.551s
00:06:28.102 sys	0m55.233s
00:06:28.102 10:14:05 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:28.102 10:14:05 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:06:28.102 ************************************
00:06:28.102 END TEST setup.sh
00:06:28.102 ************************************
00:06:28.102 10:14:05 -- common/autotest_common.sh@1142 -- # return 0
00:06:28.102 10:14:05 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:06:32.365 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:06:32.365 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:06:32.365 Hugepages
00:06:32.365 node     hugesize     free /  total
00:06:32.365 node0   1048576kB        0 /      0
00:06:32.365 node0      2048kB     1024 /   1024
00:06:32.365 node1   1048576kB        0 /      0
00:06:32.365 node1      2048kB     1024 /   1024
00:06:32.365 
00:06:32.365 Type     BDF             Vendor Device NUMA    Driver           Device     Block devices
00:06:32.365 I/OAT    0000:00:04.0    8086   2021   0       ioatdma          -          -
00:06:32.365 I/OAT    0000:00:04.1    8086   2021   0       ioatdma          -          -
00:06:32.365 I/OAT    0000:00:04.2    8086   2021   0       ioatdma          -          -
00:06:32.365 I/OAT    0000:00:04.3    8086   2021   0       ioatdma          -          -
00:06:32.365 I/OAT    0000:00:04.4    8086   2021   0       ioatdma          -          -
00:06:32.365 I/OAT    0000:00:04.5    8086   2021   0       ioatdma          -          -
00:06:32.365 I/OAT    0000:00:04.6    8086   2021   0       ioatdma          -          -
00:06:32.365 I/OAT    0000:00:04.7    8086   2021   0       ioatdma          -          -
00:06:32.365 NVMe     0000:5e:00.0    8086   0b60   0       nvme             nvme0      nvme0n1
00:06:32.365 I/OAT    0000:80:04.0    8086   2021   1       ioatdma          -          -
00:06:32.365 I/OAT    0000:80:04.1    8086   2021   1       ioatdma          -          -
00:06:32.365 I/OAT    0000:80:04.2    8086   2021   1       ioatdma          -          -
00:06:32.365 I/OAT    0000:80:04.3    8086   2021   1       ioatdma          -          -
00:06:32.365 I/OAT    0000:80:04.4    8086   2021   1       ioatdma          -          -
00:06:32.365 I/OAT    0000:80:04.5    8086   2021   1       ioatdma          -          -
00:06:32.365 I/OAT    0000:80:04.6    8086   2021   1       ioatdma          -          -
00:06:32.365 I/OAT    0000:80:04.7    8086   2021   1       ioatdma          -          -
00:06:32.365 VMD      0000:85:05.5    8086   201d   1       vfio-pci         -          -
00:06:32.365 VMD      0000:d7:05.5    8086   201d   1       vfio-pci         -          -
00:06:32.365 10:14:08 -- spdk/autotest.sh@130 -- # uname -s
00:06:32.365 10:14:08 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]]
00:06:32.365 10:14:08 -- spdk/autotest.sh@132 -- # nvme_namespace_revert
00:06:32.365 10:14:08 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:06:35.649 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:06:35.649 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:06:35.649 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:06:35.649 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:06:35.908 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:06:35.908 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:06:35.908 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:06:35.908 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:06:38.439 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci
00:06:38.439 10:14:15 -- common/autotest_common.sh@1532 -- # sleep 1
00:06:39.374 10:14:16 -- common/autotest_common.sh@1533 -- # bdfs=()
00:06:39.374 10:14:16 -- common/autotest_common.sh@1533 -- # local bdfs
00:06:39.374 10:14:16 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs))
00:06:39.374 10:14:16 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs
00:06:39.374 10:14:16 -- common/autotest_common.sh@1513 -- # bdfs=()
00:06:39.374 10:14:16 -- common/autotest_common.sh@1513 -- # local bdfs
00:06:39.374 10:14:16 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
00:06:39.374 10:14:16 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh
00:06:39.374 10:14:16 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr'
00:06:39.374 10:14:16 -- common/autotest_common.sh@1515 -- # (( 1 == 0 ))
00:06:39.374 10:14:16 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0
00:06:39.374 10:14:16 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:06:43.550 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5
00:06:43.550 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5
00:06:43.550 Waiting for block devices as requested
00:06:43.550 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme
00:06:43.550 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:06:43.550 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:06:43.550 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:06:43.550 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:06:43.550 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:06:43.550 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:06:43.550 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:06:43.550 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:06:43.807 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:06:43.807 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:06:43.807 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:06:44.065 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:06:44.065 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:06:44.065 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:06:44.322 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:06:44.322 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:06:44.322 10:14:21 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}"
00:06:44.322 10:14:21 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme
00:06:44.323 10:14:21 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]]
00:06:44.323 10:14:21 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]]
00:06:44.323 10:14:21 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1545 -- # grep oacs
00:06:44.323 10:14:21 -- common/autotest_common.sh@1545 -- # cut -d: -f2
00:06:44.323 10:14:21 -- common/autotest_common.sh@1545 -- # oacs=' 0x3f'
00:06:44.323 10:14:21 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8
00:06:44.323 10:14:21 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]]
00:06:44.323 10:14:21 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0
00:06:44.323 10:14:21 -- common/autotest_common.sh@1554 -- # grep unvmcap
00:06:44.323 10:14:21 -- common/autotest_common.sh@1554 -- # cut -d: -f2
00:06:44.323 10:14:21 -- common/autotest_common.sh@1554 -- # unvmcap=' 0'
00:06:44.323 10:14:21 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]]
00:06:44.323 10:14:21 -- common/autotest_common.sh@1557 -- # continue
00:06:44.323 10:14:21 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup
00:06:44.323 10:14:21 -- common/autotest_common.sh@728 -- # xtrace_disable
00:06:44.323 10:14:21 -- common/autotest_common.sh@10 -- # set +x
00:06:44.579 10:14:21 -- spdk/autotest.sh@138 -- # timing_enter afterboot
00:06:44.579 10:14:21 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:44.579 10:14:21 -- common/autotest_common.sh@10 -- # set +x 00:06:44.579 10:14:21 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:47.857 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:06:47.857 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:06:47.857 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:47.857 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:47.857 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:47.857 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:47.857 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:47.857 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:47.857 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:47.857 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:48.115 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:50.657 0000:5e:00.0 (8086 0b60): nvme -> vfio-pci 00:06:50.657 10:14:27 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:50.657 10:14:27 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:50.657 10:14:27 -- common/autotest_common.sh@10 -- # set +x 00:06:50.657 10:14:27 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:50.657 10:14:27 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:50.657 10:14:27 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:50.657 10:14:27 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:50.657 10:14:27 -- 
common/autotest_common.sh@1577 -- # local bdfs 00:06:50.657 10:14:27 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:50.657 10:14:27 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:50.657 10:14:27 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:50.657 10:14:27 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:50.657 10:14:27 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:50.657 10:14:27 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:50.915 10:14:27 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:50.915 10:14:27 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:50.915 10:14:27 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:50.915 10:14:27 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:50.915 10:14:27 -- common/autotest_common.sh@1580 -- # device=0x0b60 00:06:50.915 10:14:27 -- common/autotest_common.sh@1581 -- # [[ 0x0b60 == \0\x\0\a\5\4 ]] 00:06:50.915 10:14:27 -- common/autotest_common.sh@1586 -- # printf '%s\n' 00:06:50.915 10:14:27 -- common/autotest_common.sh@1592 -- # [[ -z '' ]] 00:06:50.915 10:14:27 -- common/autotest_common.sh@1593 -- # return 0 00:06:50.915 10:14:27 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:50.915 10:14:27 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:50.915 10:14:27 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:50.915 10:14:27 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:50.915 10:14:27 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:51.482 Restarting all devices. 
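The opal_revert_cleanup filter traced above keeps a bdf only when its PCI device id equals 0x0a54; here the controller reports 0x0b60, so the list stays empty and the function returns 0. A sketch with the traced values hard-coded (so it runs without the hardware):

```shell
# Sketch of the device-id filter in get_nvme_bdfs_by_id 0x0a54 above.
bdf="0000:5e:00.0"
device=0x0b60      # live value: $(cat /sys/bus/pci/devices/$bdf/device)
bdfs=""
if [ "$device" = "0x0a54" ]; then
    bdfs="$bdf"
fi
# An empty result satisfies the traced `[[ -z '' ]]`, hence `return 0`.
printf '%s\n' "$bdfs"
```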
00:06:55.699 lstat() error: No such file or directory 00:06:55.699 QAT Error: No GENERAL section found 00:06:55.699 Failed to configure qat_dev0 00:06:55.699 lstat() error: No such file or directory 00:06:55.699 QAT Error: No GENERAL section found 00:06:55.699 Failed to configure qat_dev1 00:06:55.699 lstat() error: No such file or directory 00:06:55.699 QAT Error: No GENERAL section found 00:06:55.699 Failed to configure qat_dev2 00:06:55.699 enable sriov 00:06:55.699 Checking status of all devices. 00:06:55.699 There is 3 QAT acceleration device(s) in the system: 00:06:55.699 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:06:55.699 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:06:55.699 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:06:56.653 0000:3d:00.0 set to 16 VFs 00:06:58.028 0000:3f:00.0 set to 16 VFs 00:06:59.400 0000:da:00.0 set to 16 VFs 00:07:02.682 Properly configured the qat device with driver uio_pci_generic. 
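The "set to 16 VFs" lines above reflect SR-IOV being enabled on each QAT endpoint, which on Linux is done by writing the VF count to the device's `sriov_numvfs` sysfs attribute. A dry-run sketch (the write is only printed, so it runs without QAT hardware or root):

```shell
# Dry-run sketch of enabling 16 SR-IOV VFs on one QAT endpoint, as in
# "0000:3d:00.0 set to 16 VFs" above. The actual write needs root.
bdf="0000:3d:00.0"
numvfs=16
sysfs="/sys/bus/pci/devices/$bdf/sriov_numvfs"
cmd="echo $numvfs > $sysfs"
printf '%s\n' "$cmd"
```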
00:07:02.682 10:14:39 -- spdk/autotest.sh@162 -- # timing_enter lib 00:07:02.682 10:14:39 -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:02.682 10:14:39 -- common/autotest_common.sh@10 -- # set +x 00:07:02.682 10:14:39 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:07:02.682 10:14:39 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:02.682 10:14:39 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:02.682 10:14:39 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.682 10:14:39 -- common/autotest_common.sh@10 -- # set +x 00:07:02.682 ************************************ 00:07:02.682 START TEST env 00:07:02.682 ************************************ 00:07:02.682 10:14:39 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:02.682 * Looking for test storage... 00:07:02.682 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:07:02.682 10:14:39 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:02.682 10:14:39 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:02.682 10:14:39 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.682 10:14:39 env -- common/autotest_common.sh@10 -- # set +x 00:07:02.682 ************************************ 00:07:02.682 START TEST env_memory 00:07:02.682 ************************************ 00:07:02.682 10:14:39 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:02.682 00:07:02.682 00:07:02.682 CUnit - A unit testing framework for C - Version 2.1-3 00:07:02.682 http://cunit.sourceforge.net/ 00:07:02.682 00:07:02.682 00:07:02.682 Suite: memory 00:07:02.682 Test: alloc and free memory map ...[2024-07-15 10:14:39.724409] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:07:02.682 passed 00:07:02.682 Test: mem map translation ...[2024-07-15 10:14:39.753743] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:07:02.682 [2024-07-15 10:14:39.753766] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:07:02.682 [2024-07-15 10:14:39.753822] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:07:02.682 [2024-07-15 10:14:39.753837] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:07:02.682 passed 00:07:02.682 Test: mem map registration ...[2024-07-15 10:14:39.811641] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:07:02.682 [2024-07-15 10:14:39.811679] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:07:02.682 passed 00:07:02.942 Test: mem map adjacent registrations ...passed 00:07:02.942 00:07:02.942 Run Summary: Type Total Ran Passed Failed Inactive 00:07:02.942 suites 1 1 n/a 0 0 00:07:02.942 tests 4 4 4 0 0 00:07:02.942 asserts 152 152 152 0 n/a 00:07:02.942 00:07:02.942 Elapsed time = 0.199 seconds 00:07:02.942 00:07:02.942 real 0m0.215s 00:07:02.942 user 0m0.201s 00:07:02.942 sys 0m0.013s 00:07:02.942 10:14:39 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:07:02.942 10:14:39 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:07:02.942 ************************************ 00:07:02.942 END TEST env_memory 00:07:02.942 ************************************ 00:07:02.942 10:14:39 env -- common/autotest_common.sh@1142 -- # return 0 00:07:02.942 10:14:39 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:02.942 10:14:39 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:02.942 10:14:39 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:02.942 10:14:39 env -- common/autotest_common.sh@10 -- # set +x 00:07:02.942 ************************************ 00:07:02.942 START TEST env_vtophys 00:07:02.942 ************************************ 00:07:02.942 10:14:39 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:02.942 EAL: lib.eal log level changed from notice to debug 00:07:02.942 EAL: Detected lcore 0 as core 0 on socket 0 00:07:02.942 EAL: Detected lcore 1 as core 1 on socket 0 00:07:02.942 EAL: Detected lcore 2 as core 2 on socket 0 00:07:02.942 EAL: Detected lcore 3 as core 3 on socket 0 00:07:02.942 EAL: Detected lcore 4 as core 4 on socket 0 00:07:02.942 EAL: Detected lcore 5 as core 8 on socket 0 00:07:02.942 EAL: Detected lcore 6 as core 9 on socket 0 00:07:02.942 EAL: Detected lcore 7 as core 10 on socket 0 00:07:02.942 EAL: Detected lcore 8 as core 11 on socket 0 00:07:02.942 EAL: Detected lcore 9 as core 16 on socket 0 00:07:02.942 EAL: Detected lcore 10 as core 17 on socket 0 00:07:02.942 EAL: Detected lcore 11 as core 18 on socket 0 00:07:02.942 EAL: Detected lcore 12 as core 19 on socket 0 00:07:02.942 EAL: Detected lcore 13 as core 20 on socket 0 00:07:02.942 EAL: Detected lcore 14 as core 24 on socket 0 00:07:02.942 EAL: Detected lcore 15 as core 25 on socket 0 00:07:02.942 EAL: Detected lcore 16 as core 26 on socket 0 
00:07:02.942 EAL: Detected lcore 17 as core 27 on socket 0 00:07:02.942 EAL: Detected lcore 18 as core 0 on socket 1 00:07:02.942 EAL: Detected lcore 19 as core 1 on socket 1 00:07:02.942 EAL: Detected lcore 20 as core 2 on socket 1 00:07:02.942 EAL: Detected lcore 21 as core 3 on socket 1 00:07:02.942 EAL: Detected lcore 22 as core 4 on socket 1 00:07:02.942 EAL: Detected lcore 23 as core 8 on socket 1 00:07:02.942 EAL: Detected lcore 24 as core 9 on socket 1 00:07:02.942 EAL: Detected lcore 25 as core 10 on socket 1 00:07:02.942 EAL: Detected lcore 26 as core 11 on socket 1 00:07:02.942 EAL: Detected lcore 27 as core 16 on socket 1 00:07:02.942 EAL: Detected lcore 28 as core 17 on socket 1 00:07:02.942 EAL: Detected lcore 29 as core 18 on socket 1 00:07:02.942 EAL: Detected lcore 30 as core 19 on socket 1 00:07:02.942 EAL: Detected lcore 31 as core 20 on socket 1 00:07:02.942 EAL: Detected lcore 32 as core 24 on socket 1 00:07:02.942 EAL: Detected lcore 33 as core 25 on socket 1 00:07:02.942 EAL: Detected lcore 34 as core 26 on socket 1 00:07:02.942 EAL: Detected lcore 35 as core 27 on socket 1 00:07:02.942 EAL: Detected lcore 36 as core 0 on socket 0 00:07:02.942 EAL: Detected lcore 37 as core 1 on socket 0 00:07:02.942 EAL: Detected lcore 38 as core 2 on socket 0 00:07:02.942 EAL: Detected lcore 39 as core 3 on socket 0 00:07:02.942 EAL: Detected lcore 40 as core 4 on socket 0 00:07:02.942 EAL: Detected lcore 41 as core 8 on socket 0 00:07:02.942 EAL: Detected lcore 42 as core 9 on socket 0 00:07:02.942 EAL: Detected lcore 43 as core 10 on socket 0 00:07:02.942 EAL: Detected lcore 44 as core 11 on socket 0 00:07:02.942 EAL: Detected lcore 45 as core 16 on socket 0 00:07:02.942 EAL: Detected lcore 46 as core 17 on socket 0 00:07:02.942 EAL: Detected lcore 47 as core 18 on socket 0 00:07:02.942 EAL: Detected lcore 48 as core 19 on socket 0 00:07:02.942 EAL: Detected lcore 49 as core 20 on socket 0 00:07:02.942 EAL: Detected lcore 50 as core 24 on socket 0 
00:07:02.942 EAL: Detected lcore 51 as core 25 on socket 0 00:07:02.942 EAL: Detected lcore 52 as core 26 on socket 0 00:07:02.942 EAL: Detected lcore 53 as core 27 on socket 0 00:07:02.942 EAL: Detected lcore 54 as core 0 on socket 1 00:07:02.942 EAL: Detected lcore 55 as core 1 on socket 1 00:07:02.942 EAL: Detected lcore 56 as core 2 on socket 1 00:07:02.942 EAL: Detected lcore 57 as core 3 on socket 1 00:07:02.942 EAL: Detected lcore 58 as core 4 on socket 1 00:07:02.942 EAL: Detected lcore 59 as core 8 on socket 1 00:07:02.942 EAL: Detected lcore 60 as core 9 on socket 1 00:07:02.942 EAL: Detected lcore 61 as core 10 on socket 1 00:07:02.942 EAL: Detected lcore 62 as core 11 on socket 1 00:07:02.942 EAL: Detected lcore 63 as core 16 on socket 1 00:07:02.942 EAL: Detected lcore 64 as core 17 on socket 1 00:07:02.942 EAL: Detected lcore 65 as core 18 on socket 1 00:07:02.942 EAL: Detected lcore 66 as core 19 on socket 1 00:07:02.942 EAL: Detected lcore 67 as core 20 on socket 1 00:07:02.942 EAL: Detected lcore 68 as core 24 on socket 1 00:07:02.942 EAL: Detected lcore 69 as core 25 on socket 1 00:07:02.942 EAL: Detected lcore 70 as core 26 on socket 1 00:07:02.942 EAL: Detected lcore 71 as core 27 on socket 1 00:07:02.942 EAL: Maximum logical cores by configuration: 128 00:07:02.942 EAL: Detected CPU lcores: 72 00:07:02.942 EAL: Detected NUMA nodes: 2 00:07:02.942 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:07:02.942 EAL: Detected shared linkage of DPDK 00:07:02.942 EAL: No shared files mode enabled, IPC will be disabled 00:07:02.942 EAL: No shared files mode enabled, IPC is disabled 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 
'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.1 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:07:02.942 EAL: 
PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:07:02.942 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.3 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:07:02.943 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:07:02.943 EAL: Bus pci wants IOVA as 'PA' 00:07:02.943 EAL: Bus auxiliary wants IOVA as 'DC' 00:07:02.943 EAL: Bus vdev wants IOVA as 'DC' 00:07:02.943 EAL: Selected IOVA mode 'PA' 00:07:02.943 EAL: Probing VFIO support... 00:07:02.943 EAL: IOMMU type 1 (Type 1) is supported 00:07:02.943 EAL: IOMMU type 7 (sPAPR) is not supported 00:07:02.943 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:07:02.943 EAL: VFIO support initialized 00:07:02.943 EAL: Ask a virtual area of 0x2e000 bytes 00:07:02.943 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:07:02.943 EAL: Setting up physically contiguous memory... 
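The "Selected IOVA mode 'PA'" line above follows from the per-device requests: any bus or driver demanding 'PA' (every qat VF here) rules out 'VA', while 'DC' means don't care. A condensed sketch of that selection (the request list is a hand-picked sample, not the full set printed above):

```shell
# Hedged sketch of EAL's IOVA mode choice: one 'PA' request forces 'PA'.
requests="PA PA DC DC"   # condensed sample of the per-device lines above
mode="VA"
for r in $requests; do
    if [ "$r" = "PA" ]; then mode="PA"; fi
done
echo "Selected IOVA mode '$mode'"
```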
00:07:02.943 EAL: Setting maximum number of open files to 524288 00:07:02.943 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:07:02.943 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:07:02.943 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:07:02.943 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:07:02.943 EAL: Ask a virtual area of 0x61000 bytes 00:07:02.943 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:07:02.943 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:02.943 EAL: Ask a virtual area of 0x400000000 bytes 00:07:02.943 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:07:02.943 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:07:02.943 EAL: Hugepages will be freed exactly as allocated. 
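The repeated "size = 0x400000000" reservations above are consistent with the stated list geometry: each memseg list holds n_segs:8192 pages of hugepage_sz:2097152 bytes (2 MiB), i.e. 16 GiB of reserved VA per list. A quick arithmetic check:

```shell
# Verify the per-list VA reservation traced above:
# 8192 segments x 2 MiB hugepages = 0x400000000 bytes (16 GiB).
n_segs=8192
hugepage_sz=2097152
per_list=$((n_segs * hugepage_sz))
printf '0x%x\n' "$per_list"
```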
00:07:02.943 EAL: No shared files mode enabled, IPC is disabled 00:07:02.943 EAL: No shared files mode enabled, IPC is disabled 00:07:02.943 EAL: TSC frequency is ~2300000 KHz 00:07:02.943 EAL: Main lcore 0 is ready (tid=7f5f72ef8b00;cpuset=[0]) 00:07:02.943 EAL: Trying to obtain current memory policy. 00:07:02.943 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:02.943 EAL: Restoring previous memory policy: 0 00:07:02.943 EAL: request: mp_malloc_sync 00:07:02.943 EAL: No shared files mode enabled, IPC is disabled 00:07:02.943 EAL: Heap on socket 0 was expanded by 2MB 00:07:02.943 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001000000 00:07:02.943 EAL: PCI memory mapped at 0x202001001000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001002000 00:07:02.943 EAL: PCI memory mapped at 0x202001003000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001004000 00:07:02.943 EAL: PCI memory mapped at 0x202001005000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001006000 00:07:02.943 EAL: PCI memory mapped at 0x202001007000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001008000 00:07:02.943 EAL: PCI memory mapped at 0x202001009000 00:07:02.943 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x20200100a000 00:07:02.943 EAL: PCI memory mapped at 0x20200100b000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x20200100c000 00:07:02.943 EAL: PCI memory mapped at 0x20200100d000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x20200100e000 00:07:02.943 EAL: PCI memory mapped at 0x20200100f000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001010000 00:07:02.943 EAL: PCI memory mapped at 0x202001011000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.1 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001012000 00:07:02.943 EAL: PCI memory mapped at 0x202001013000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001014000 00:07:02.943 EAL: PCI memory mapped at 0x202001015000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 
0x202001016000 00:07:02.943 EAL: PCI memory mapped at 0x202001017000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001018000 00:07:02.943 EAL: PCI memory mapped at 0x202001019000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x20200101a000 00:07:02.943 EAL: PCI memory mapped at 0x20200101b000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x20200101c000 00:07:02.943 EAL: PCI memory mapped at 0x20200101d000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:02.943 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x20200101e000 00:07:02.943 EAL: PCI memory mapped at 0x20200101f000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:02.943 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001020000 00:07:02.943 EAL: PCI memory mapped at 0x202001021000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:02.943 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:07:02.943 EAL: probe driver: 8086:37c9 qat 00:07:02.943 EAL: PCI memory mapped at 0x202001022000 00:07:02.943 EAL: PCI memory mapped at 0x202001023000 00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:02.943 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 
00:07:02.943 EAL: probe driver: 8086:37c9 qat
00:07:02.943 EAL: PCI memory mapped at 0x202001024000
00:07:02.943 EAL: PCI memory mapped at 0x202001025000
00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:07:02.943 EAL: PCI device 0000:3f:01.3 on NUMA socket 0
00:07:02.943 EAL: probe driver: 8086:37c9 qat
00:07:02.943 EAL: PCI memory mapped at 0x202001026000
00:07:02.943 EAL: PCI memory mapped at 0x202001027000
00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:07:02.943 EAL: PCI device 0000:3f:01.4 on NUMA socket 0
00:07:02.943 EAL: probe driver: 8086:37c9 qat
00:07:02.943 EAL: PCI memory mapped at 0x202001028000
00:07:02.943 EAL: PCI memory mapped at 0x202001029000
00:07:02.943 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:07:02.943 EAL: PCI device 0000:3f:01.5 on NUMA socket 0
00:07:02.943 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200102a000
00:07:02.944 EAL: PCI memory mapped at 0x20200102b000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:01.6 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200102c000
00:07:02.944 EAL: PCI memory mapped at 0x20200102d000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:01.7 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200102e000
00:07:02.944 EAL: PCI memory mapped at 0x20200102f000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.0 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001030000
00:07:02.944 EAL: PCI memory mapped at 0x202001031000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.1 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001032000
00:07:02.944 EAL: PCI memory mapped at 0x202001033000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.2 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001034000
00:07:02.944 EAL: PCI memory mapped at 0x202001035000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.3 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001036000
00:07:02.944 EAL: PCI memory mapped at 0x202001037000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.4 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001038000
00:07:02.944 EAL: PCI memory mapped at 0x202001039000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.5 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200103a000
00:07:02.944 EAL: PCI memory mapped at 0x20200103b000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.6 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200103c000
00:07:02.944 EAL: PCI memory mapped at 0x20200103d000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:07:02.944 EAL: PCI device 0000:3f:02.7 on NUMA socket 0
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200103e000
00:07:02.944 EAL: PCI memory mapped at 0x20200103f000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:07:02.944 EAL: PCI device 0000:da:01.0 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001040000
00:07:02.944 EAL: PCI memory mapped at 0x202001041000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:07:02.944 EAL: Trying to obtain current memory policy.
00:07:02.944 EAL: Setting policy MPOL_PREFERRED for socket 1
00:07:02.944 EAL: Restoring previous memory policy: 4
00:07:02.944 EAL: request: mp_malloc_sync
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: Heap on socket 1 was expanded by 2MB
00:07:02.944 EAL: PCI device 0000:da:01.1 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001042000
00:07:02.944 EAL: PCI memory mapped at 0x202001043000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:01.2 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001044000
00:07:02.944 EAL: PCI memory mapped at 0x202001045000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:01.3 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001046000
00:07:02.944 EAL: PCI memory mapped at 0x202001047000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:01.4 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001048000
00:07:02.944 EAL: PCI memory mapped at 0x202001049000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:01.5 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200104a000
00:07:02.944 EAL: PCI memory mapped at 0x20200104b000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:01.6 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200104c000
00:07:02.944 EAL: PCI memory mapped at 0x20200104d000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:01.7 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200104e000
00:07:02.944 EAL: PCI memory mapped at 0x20200104f000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.0 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001050000
00:07:02.944 EAL: PCI memory mapped at 0x202001051000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.1 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001052000
00:07:02.944 EAL: PCI memory mapped at 0x202001053000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.2 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001054000
00:07:02.944 EAL: PCI memory mapped at 0x202001055000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.3 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001056000
00:07:02.944 EAL: PCI memory mapped at 0x202001057000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.4 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x202001058000
00:07:02.944 EAL: PCI memory mapped at 0x202001059000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.5 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200105a000
00:07:02.944 EAL: PCI memory mapped at 0x20200105b000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200105c000
00:07:02.944 EAL: PCI memory mapped at 0x20200105d000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:07:02.944 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:07:02.944 EAL: probe driver: 8086:37c9 qat
00:07:02.944 EAL: PCI memory mapped at 0x20200105e000
00:07:02.944 EAL: PCI memory mapped at 0x20200105f000
00:07:02.944 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: No PCI address specified using 'addr=' in: bus=pci
00:07:02.944 EAL: Mem event callback 'spdk:(nil)' registered
00:07:02.944 
00:07:02.944 
00:07:02.944 CUnit - A unit testing framework for C - Version 2.1-3
00:07:02.944 http://cunit.sourceforge.net/
00:07:02.944 
00:07:02.944 
00:07:02.944 Suite: components_suite
00:07:02.944 Test: vtophys_malloc_test ...passed
00:07:02.944 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:07:02.944 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:02.944 EAL: Restoring previous memory policy: 4
00:07:02.944 EAL: Calling mem event callback 'spdk:(nil)'
00:07:02.944 EAL: request: mp_malloc_sync
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: Heap on socket 0 was expanded by 4MB
00:07:02.944 EAL: Calling mem event callback 'spdk:(nil)'
00:07:02.944 EAL: request: mp_malloc_sync
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: Heap on socket 0 was shrunk by 4MB
00:07:02.944 EAL: Trying to obtain current memory policy.
00:07:02.944 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:02.944 EAL: Restoring previous memory policy: 4
00:07:02.944 EAL: Calling mem event callback 'spdk:(nil)'
00:07:02.944 EAL: request: mp_malloc_sync
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: Heap on socket 0 was expanded by 6MB
00:07:02.944 EAL: Calling mem event callback 'spdk:(nil)'
00:07:02.944 EAL: request: mp_malloc_sync
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: Heap on socket 0 was shrunk by 6MB
00:07:02.944 EAL: Trying to obtain current memory policy.
00:07:02.944 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:02.944 EAL: Restoring previous memory policy: 4
00:07:02.944 EAL: Calling mem event callback 'spdk:(nil)'
00:07:02.944 EAL: request: mp_malloc_sync
00:07:02.944 EAL: No shared files mode enabled, IPC is disabled
00:07:02.944 EAL: Heap on socket 0 was expanded by 10MB
00:07:02.944 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was shrunk by 10MB
00:07:03.202 EAL: Trying to obtain current memory policy.
00:07:03.202 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:03.202 EAL: Restoring previous memory policy: 4
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was expanded by 18MB
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was shrunk by 18MB
00:07:03.202 EAL: Trying to obtain current memory policy.
00:07:03.202 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:03.202 EAL: Restoring previous memory policy: 4
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was expanded by 34MB
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was shrunk by 34MB
00:07:03.202 EAL: Trying to obtain current memory policy.
00:07:03.202 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:03.202 EAL: Restoring previous memory policy: 4
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was expanded by 66MB
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was shrunk by 66MB
00:07:03.202 EAL: Trying to obtain current memory policy.
00:07:03.202 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:03.202 EAL: Restoring previous memory policy: 4
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was expanded by 130MB
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was shrunk by 130MB
00:07:03.202 EAL: Trying to obtain current memory policy.
00:07:03.202 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:03.202 EAL: Restoring previous memory policy: 4
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.202 EAL: request: mp_malloc_sync
00:07:03.202 EAL: No shared files mode enabled, IPC is disabled
00:07:03.202 EAL: Heap on socket 0 was expanded by 258MB
00:07:03.202 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.460 EAL: request: mp_malloc_sync
00:07:03.460 EAL: No shared files mode enabled, IPC is disabled
00:07:03.460 EAL: Heap on socket 0 was shrunk by 258MB
00:07:03.460 EAL: Trying to obtain current memory policy.
00:07:03.460 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:03.460 EAL: Restoring previous memory policy: 4
00:07:03.460 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.460 EAL: request: mp_malloc_sync
00:07:03.460 EAL: No shared files mode enabled, IPC is disabled
00:07:03.460 EAL: Heap on socket 0 was expanded by 514MB
00:07:03.460 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.718 EAL: request: mp_malloc_sync
00:07:03.718 EAL: No shared files mode enabled, IPC is disabled
00:07:03.718 EAL: Heap on socket 0 was shrunk by 514MB
00:07:03.718 EAL: Trying to obtain current memory policy.
00:07:03.718 EAL: Setting policy MPOL_PREFERRED for socket 0
00:07:03.976 EAL: Restoring previous memory policy: 4
00:07:03.976 EAL: Calling mem event callback 'spdk:(nil)'
00:07:03.976 EAL: request: mp_malloc_sync
00:07:03.976 EAL: No shared files mode enabled, IPC is disabled
00:07:03.976 EAL: Heap on socket 0 was expanded by 1026MB
00:07:04.234 EAL: Calling mem event callback 'spdk:(nil)'
00:07:04.234 EAL: request: mp_malloc_sync
00:07:04.234 EAL: No shared files mode enabled, IPC is disabled
00:07:04.234 EAL: Heap on socket 0 was shrunk by 1026MB
00:07:04.234 passed
00:07:04.234 
00:07:04.234 Run Summary: Type Total Ran Passed Failed Inactive
00:07:04.234 suites 1 1 n/a 0 0
00:07:04.234 tests 2 2 2 0 0
00:07:04.234 asserts 5603 5603 5603 0 n/a
00:07:04.234 
00:07:04.234 Elapsed time = 1.209 seconds
00:07:04.234 EAL: No shared files mode enabled, IPC is disabled
00:07:04.234 EAL: No shared files mode enabled, IPC is disabled
00:07:04.234 EAL: No shared files mode enabled, IPC is disabled
00:07:04.234 
00:07:04.234 real 0m1.407s
00:07:04.234 user 0m0.786s
00:07:04.234 sys 0m0.591s
00:07:04.234 10:14:41 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:04.234 10:14:41 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:07:04.234 ************************************
00:07:04.234 END TEST env_vtophys
00:07:04.234 ************************************
00:07:04.234 10:14:41 env -- common/autotest_common.sh@1142 -- # return 0
00:07:04.234 10:14:41 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:07:04.234 10:14:41 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:04.234 10:14:41 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:04.234 10:14:41 env -- common/autotest_common.sh@10 -- # set +x
00:07:04.493 ************************************
00:07:04.493 START TEST env_pci
00:07:04.493 ************************************
00:07:04.493 10:14:41 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:07:04.493 
00:07:04.493 
00:07:04.493 CUnit - A unit testing framework for C - Version 2.1-3
00:07:04.493 http://cunit.sourceforge.net/
00:07:04.493 
00:07:04.493 
00:07:04.493 Suite: pci
00:07:04.493 Test: pci_hook ...[2024-07-15 10:14:41.475309] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 433011 has claimed it
00:07:04.493 EAL: Cannot find device (10000:00:01.0)
00:07:04.493 EAL: Failed to attach device on primary process
00:07:04.493 passed
00:07:04.493 
00:07:04.493 Run Summary: Type Total Ran Passed Failed Inactive
00:07:04.493 suites 1 1 n/a 0 0
00:07:04.493 tests 1 1 1 0 0
00:07:04.493 asserts 25 25 25 0 n/a
00:07:04.494 
00:07:04.494 Elapsed time = 0.041 seconds
00:07:04.494 
00:07:04.494 real 0m0.068s
00:07:04.494 user 0m0.019s
00:07:04.494 sys 0m0.049s
00:07:04.494 10:14:41 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:04.494 10:14:41 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:07:04.494 ************************************
00:07:04.494 END TEST env_pci
00:07:04.494 ************************************
00:07:04.494 10:14:41 env -- common/autotest_common.sh@1142 -- # return 0
00:07:04.494 10:14:41 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:07:04.494 10:14:41 env -- env/env.sh@15 -- # uname
00:07:04.494 10:14:41 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:07:04.494 10:14:41 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:07:04.494 10:14:41 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:07:04.494 10:14:41 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:07:04.494 10:14:41 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:04.494 10:14:41 env -- common/autotest_common.sh@10 -- # set +x
00:07:04.494 ************************************
00:07:04.494 START TEST env_dpdk_post_init
00:07:04.494 ************************************
00:07:04.494 10:14:41 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:07:04.494 EAL: Detected CPU lcores: 72
00:07:04.494 EAL: Detected NUMA nodes: 2
00:07:04.494 EAL: Detected shared linkage of DPDK
00:07:04.494 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:07:04.494 EAL: Selected IOVA mode 'PA'
00:07:04.494 EAL: VFIO support initialized
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.494 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.494 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym
00:07:04.494 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.754 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.754 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym
00:07:04.754 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0
00:07:04.755 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym
00:07:04.755 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0
00:07:04.755 TELEMETRY: No legacy callbacks, legacy socket not created
00:07:04.755 EAL: Using IOMMU type 1 (Type 1)
00:07:04.755 EAL: Ignore 
mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:07:04.755 EAL: Ignore mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:07:04.755 EAL: Ignore mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:07:04.755 EAL: Ignore mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:07:04.755 EAL: Ignore mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:07:04.755 EAL: Ignore mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:07:04.755 EAL: Ignore mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:07:04.755 EAL: Ignore mapping IO port bar(1) 00:07:04.755 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:07:05.013 EAL: Probe PCI driver: spdk_nvme (8086:0b60) device: 0000:5e:00.0 (socket 0) 00:07:05.013 EAL: Ignore mapping IO port bar(1) 00:07:05.013 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:07:05.013 EAL: Ignore mapping IO port bar(1) 00:07:05.013 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:07:05.013 EAL: Ignore mapping IO port bar(1) 00:07:05.013 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:07:05.013 EAL: Ignore mapping IO port bar(1) 00:07:05.013 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:07:05.013 EAL: Ignore mapping IO port bar(1) 00:07:05.013 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:07:05.271 EAL: Ignore mapping IO port bar(1) 00:07:05.271 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 
00:07:05.271 EAL: Ignore mapping IO port bar(1) 00:07:05.271 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:07:05.271 EAL: Ignore mapping IO port bar(1) 00:07:05.271 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:07:05.271 EAL: Ignore mapping IO port bar(1) 00:07:05.271 EAL: Ignore mapping IO port bar(5) 00:07:05.271 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:85:05.5 (socket 1) 00:07:05.271 EAL: Ignore mapping IO port bar(1) 00:07:05.271 EAL: Ignore mapping IO port bar(5) 00:07:05.271 EAL: Probe PCI driver: spdk_vmd (8086:201d) device: 0000:d7:05.5 (socket 1) 00:07:08.545 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:07:08.545 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:07:08.545 Starting DPDK initialization... 00:07:08.545 Starting SPDK post initialization... 00:07:08.545 SPDK NVMe probe 00:07:08.545 Attaching to 0000:5e:00.0 00:07:08.545 Attached to 0000:5e:00.0 00:07:08.545 Cleaning up... 
00:07:08.545 00:07:08.545 real 0m3.524s 00:07:08.545 user 0m2.422s 00:07:08.545 sys 0m0.660s 00:07:08.545 10:14:45 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.545 10:14:45 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:07:08.545 ************************************ 00:07:08.545 END TEST env_dpdk_post_init 00:07:08.545 ************************************ 00:07:08.545 10:14:45 env -- common/autotest_common.sh@1142 -- # return 0 00:07:08.545 10:14:45 env -- env/env.sh@26 -- # uname 00:07:08.545 10:14:45 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:07:08.545 10:14:45 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:08.545 10:14:45 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:08.545 10:14:45 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.545 10:14:45 env -- common/autotest_common.sh@10 -- # set +x 00:07:08.545 ************************************ 00:07:08.545 START TEST env_mem_callbacks 00:07:08.545 ************************************ 00:07:08.545 10:14:45 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:08.545 EAL: Detected CPU lcores: 72 00:07:08.545 EAL: Detected NUMA nodes: 2 00:07:08.545 EAL: Detected shared linkage of DPDK 00:07:08.545 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:08.545 EAL: Selected IOVA mode 'PA' 00:07:08.545 EAL: VFIO support initialized 00:07:08.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:07:08.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:07:08.545 CRYPTODEV: Initialisation parameters - name: 
0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:07:08.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:07:08.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:07:08.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:07:08.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:07:08.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.545 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:07:08.546 CRYPTODEV: Initialisation 
parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 
0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, 
max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3f:01.3 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating 
cryptodev 0000:3f:01.7_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:07:08.546 
CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.546 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:07:08.546 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:07:08.547 
CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 
00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:07:08.547 CRYPTODEV: Initialisation 
parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:08.547 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:07:08.547 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:08.547 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:08.547 00:07:08.547 00:07:08.547 CUnit - A unit testing framework for C - Version 2.1-3 00:07:08.547 http://cunit.sourceforge.net/ 00:07:08.547 00:07:08.547 00:07:08.547 Suite: memory 00:07:08.547 Test: test ... 00:07:08.547 register 0x200000200000 2097152 00:07:08.547 register 0x201000a00000 2097152 00:07:08.547 malloc 3145728 00:07:08.547 register 0x200000400000 4194304 00:07:08.547 buf 0x200000500000 len 3145728 PASSED 00:07:08.547 malloc 64 00:07:08.547 buf 0x2000004fff40 len 64 PASSED 00:07:08.547 malloc 4194304 00:07:08.547 register 0x200000800000 6291456 00:07:08.547 buf 0x200000a00000 len 4194304 PASSED 00:07:08.547 free 0x200000500000 3145728 00:07:08.547 free 0x2000004fff40 64 00:07:08.547 unregister 0x200000400000 4194304 PASSED 00:07:08.547 free 0x200000a00000 4194304 00:07:08.547 unregister 0x200000800000 6291456 PASSED 00:07:08.547 malloc 8388608 00:07:08.547 register 0x200000400000 10485760 00:07:08.547 buf 0x200000600000 len 8388608 PASSED 00:07:08.547 free 0x200000600000 8388608 00:07:08.547 unregister 0x200000400000 10485760 PASSED 00:07:08.547 passed 00:07:08.547 00:07:08.547 Run Summary: Type Total Ran Passed Failed Inactive 00:07:08.547 suites 1 1 n/a 0 0 00:07:08.547 tests 1 1 1 0 0 
00:07:08.547 asserts 16 16 16 0 n/a 00:07:08.547 00:07:08.547 Elapsed time = 0.005 seconds 00:07:08.547 00:07:08.547 real 0m0.107s 00:07:08.547 user 0m0.029s 00:07:08.547 sys 0m0.078s 00:07:08.547 10:14:45 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.547 10:14:45 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:07:08.547 ************************************ 00:07:08.547 END TEST env_mem_callbacks 00:07:08.547 ************************************ 00:07:08.547 10:14:45 env -- common/autotest_common.sh@1142 -- # return 0 00:07:08.547 00:07:08.547 real 0m5.822s 00:07:08.547 user 0m3.645s 00:07:08.547 sys 0m1.744s 00:07:08.547 10:14:45 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:08.547 10:14:45 env -- common/autotest_common.sh@10 -- # set +x 00:07:08.547 ************************************ 00:07:08.547 END TEST env 00:07:08.547 ************************************ 00:07:08.548 10:14:45 -- common/autotest_common.sh@1142 -- # return 0 00:07:08.548 10:14:45 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:07:08.548 10:14:45 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:08.548 10:14:45 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:08.548 10:14:45 -- common/autotest_common.sh@10 -- # set +x 00:07:08.548 ************************************ 00:07:08.548 START TEST rpc 00:07:08.548 ************************************ 00:07:08.548 10:14:45 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:07:08.548 * Looking for test storage... 
00:07:08.548 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:08.548 10:14:45 rpc -- rpc/rpc.sh@65 -- # spdk_pid=433683 00:07:08.548 10:14:45 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:07:08.548 10:14:45 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:08.548 10:14:45 rpc -- rpc/rpc.sh@67 -- # waitforlisten 433683 00:07:08.548 10:14:45 rpc -- common/autotest_common.sh@829 -- # '[' -z 433683 ']' 00:07:08.548 10:14:45 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:08.548 10:14:45 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:08.548 10:14:45 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:08.548 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:08.548 10:14:45 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:08.548 10:14:45 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:08.548 [2024-07-15 10:14:45.620612] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:07:08.548 [2024-07-15 10:14:45.620693] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid433683 ] 00:07:08.805 [2024-07-15 10:14:45.751685] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.805 [2024-07-15 10:14:45.853683] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:07:08.805 [2024-07-15 10:14:45.853738] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 433683' to capture a snapshot of events at runtime. 
00:07:08.805 [2024-07-15 10:14:45.853753] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:08.805 [2024-07-15 10:14:45.853767] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:08.805 [2024-07-15 10:14:45.853778] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid433683 for offline analysis/debug. 00:07:08.805 [2024-07-15 10:14:45.853810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.368 10:14:46 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:09.368 10:14:46 rpc -- common/autotest_common.sh@862 -- # return 0 00:07:09.368 10:14:46 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:09.368 10:14:46 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:09.368 10:14:46 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:07:09.368 10:14:46 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:07:09.368 10:14:46 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:09.368 10:14:46 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.368 10:14:46 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:09.626 ************************************ 00:07:09.626 START TEST rpc_integrity 00:07:09.626 ************************************ 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 
00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:09.626 { 00:07:09.626 "name": "Malloc0", 00:07:09.626 "aliases": [ 00:07:09.626 "b254d9a2-a52d-40d9-8c95-463f965154bb" 00:07:09.626 ], 00:07:09.626 "product_name": "Malloc disk", 00:07:09.626 "block_size": 512, 00:07:09.626 "num_blocks": 16384, 00:07:09.626 "uuid": "b254d9a2-a52d-40d9-8c95-463f965154bb", 00:07:09.626 "assigned_rate_limits": { 00:07:09.626 "rw_ios_per_sec": 0, 00:07:09.626 "rw_mbytes_per_sec": 0, 00:07:09.626 "r_mbytes_per_sec": 0, 00:07:09.626 "w_mbytes_per_sec": 0 00:07:09.626 }, 00:07:09.626 "claimed": false, 00:07:09.626 
"zoned": false, 00:07:09.626 "supported_io_types": { 00:07:09.626 "read": true, 00:07:09.626 "write": true, 00:07:09.626 "unmap": true, 00:07:09.626 "flush": true, 00:07:09.626 "reset": true, 00:07:09.626 "nvme_admin": false, 00:07:09.626 "nvme_io": false, 00:07:09.626 "nvme_io_md": false, 00:07:09.626 "write_zeroes": true, 00:07:09.626 "zcopy": true, 00:07:09.626 "get_zone_info": false, 00:07:09.626 "zone_management": false, 00:07:09.626 "zone_append": false, 00:07:09.626 "compare": false, 00:07:09.626 "compare_and_write": false, 00:07:09.626 "abort": true, 00:07:09.626 "seek_hole": false, 00:07:09.626 "seek_data": false, 00:07:09.626 "copy": true, 00:07:09.626 "nvme_iov_md": false 00:07:09.626 }, 00:07:09.626 "memory_domains": [ 00:07:09.626 { 00:07:09.626 "dma_device_id": "system", 00:07:09.626 "dma_device_type": 1 00:07:09.626 }, 00:07:09.626 { 00:07:09.626 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:09.626 "dma_device_type": 2 00:07:09.626 } 00:07:09.626 ], 00:07:09.626 "driver_specific": {} 00:07:09.626 } 00:07:09.626 ]' 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.626 [2024-07-15 10:14:46.729534] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:07:09.626 [2024-07-15 10:14:46.729576] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:09.626 [2024-07-15 10:14:46.729596] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2322eb0 00:07:09.626 [2024-07-15 10:14:46.729609] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:09.626 [2024-07-15 
10:14:46.731111] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:09.626 [2024-07-15 10:14:46.731139] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:09.626 Passthru0 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.626 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.626 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:09.626 { 00:07:09.626 "name": "Malloc0", 00:07:09.626 "aliases": [ 00:07:09.626 "b254d9a2-a52d-40d9-8c95-463f965154bb" 00:07:09.626 ], 00:07:09.626 "product_name": "Malloc disk", 00:07:09.626 "block_size": 512, 00:07:09.626 "num_blocks": 16384, 00:07:09.626 "uuid": "b254d9a2-a52d-40d9-8c95-463f965154bb", 00:07:09.626 "assigned_rate_limits": { 00:07:09.626 "rw_ios_per_sec": 0, 00:07:09.626 "rw_mbytes_per_sec": 0, 00:07:09.626 "r_mbytes_per_sec": 0, 00:07:09.626 "w_mbytes_per_sec": 0 00:07:09.626 }, 00:07:09.626 "claimed": true, 00:07:09.626 "claim_type": "exclusive_write", 00:07:09.626 "zoned": false, 00:07:09.626 "supported_io_types": { 00:07:09.626 "read": true, 00:07:09.626 "write": true, 00:07:09.626 "unmap": true, 00:07:09.626 "flush": true, 00:07:09.626 "reset": true, 00:07:09.626 "nvme_admin": false, 00:07:09.626 "nvme_io": false, 00:07:09.626 "nvme_io_md": false, 00:07:09.626 "write_zeroes": true, 00:07:09.626 "zcopy": true, 00:07:09.626 "get_zone_info": false, 00:07:09.626 "zone_management": false, 00:07:09.626 "zone_append": false, 00:07:09.626 "compare": false, 00:07:09.626 "compare_and_write": false, 00:07:09.626 "abort": true, 00:07:09.626 "seek_hole": false, 00:07:09.626 "seek_data": false, 
00:07:09.626 "copy": true, 00:07:09.626 "nvme_iov_md": false 00:07:09.626 }, 00:07:09.626 "memory_domains": [ 00:07:09.626 { 00:07:09.626 "dma_device_id": "system", 00:07:09.626 "dma_device_type": 1 00:07:09.626 }, 00:07:09.626 { 00:07:09.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:09.627 "dma_device_type": 2 00:07:09.627 } 00:07:09.627 ], 00:07:09.627 "driver_specific": {} 00:07:09.627 }, 00:07:09.627 { 00:07:09.627 "name": "Passthru0", 00:07:09.627 "aliases": [ 00:07:09.627 "43ab3abd-388a-57f1-90c1-1035301100ad" 00:07:09.627 ], 00:07:09.627 "product_name": "passthru", 00:07:09.627 "block_size": 512, 00:07:09.627 "num_blocks": 16384, 00:07:09.627 "uuid": "43ab3abd-388a-57f1-90c1-1035301100ad", 00:07:09.627 "assigned_rate_limits": { 00:07:09.627 "rw_ios_per_sec": 0, 00:07:09.627 "rw_mbytes_per_sec": 0, 00:07:09.627 "r_mbytes_per_sec": 0, 00:07:09.627 "w_mbytes_per_sec": 0 00:07:09.627 }, 00:07:09.627 "claimed": false, 00:07:09.627 "zoned": false, 00:07:09.627 "supported_io_types": { 00:07:09.627 "read": true, 00:07:09.627 "write": true, 00:07:09.627 "unmap": true, 00:07:09.627 "flush": true, 00:07:09.627 "reset": true, 00:07:09.627 "nvme_admin": false, 00:07:09.627 "nvme_io": false, 00:07:09.627 "nvme_io_md": false, 00:07:09.627 "write_zeroes": true, 00:07:09.627 "zcopy": true, 00:07:09.627 "get_zone_info": false, 00:07:09.627 "zone_management": false, 00:07:09.627 "zone_append": false, 00:07:09.627 "compare": false, 00:07:09.627 "compare_and_write": false, 00:07:09.627 "abort": true, 00:07:09.627 "seek_hole": false, 00:07:09.627 "seek_data": false, 00:07:09.627 "copy": true, 00:07:09.627 "nvme_iov_md": false 00:07:09.627 }, 00:07:09.627 "memory_domains": [ 00:07:09.627 { 00:07:09.627 "dma_device_id": "system", 00:07:09.627 "dma_device_type": 1 00:07:09.627 }, 00:07:09.627 { 00:07:09.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:09.627 "dma_device_type": 2 00:07:09.627 } 00:07:09.627 ], 00:07:09.627 "driver_specific": { 00:07:09.627 "passthru": { 
00:07:09.627 "name": "Passthru0", 00:07:09.627 "base_bdev_name": "Malloc0" 00:07:09.627 } 00:07:09.627 } 00:07:09.627 } 00:07:09.627 ]' 00:07:09.627 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:09.627 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:09.627 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.627 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.627 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.627 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.884 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.884 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:09.884 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:09.884 10:14:46 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:09.884 00:07:09.884 real 0m0.288s 00:07:09.884 user 0m0.183s 00:07:09.884 sys 0m0.046s 00:07:09.884 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:09.884 10:14:46 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:09.884 ************************************ 00:07:09.884 END TEST rpc_integrity 00:07:09.884 
************************************ 00:07:09.884 10:14:46 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:09.884 10:14:46 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:07:09.884 10:14:46 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:09.884 10:14:46 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:09.884 10:14:46 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:09.884 ************************************ 00:07:09.884 START TEST rpc_plugins 00:07:09.884 ************************************ 00:07:09.884 10:14:46 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:07:09.884 10:14:46 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:07:09.884 10:14:46 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.884 10:14:46 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:09.884 10:14:46 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.884 10:14:46 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:07:09.884 10:14:46 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:07:09.884 10:14:46 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.884 10:14:46 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:09.884 10:14:46 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.884 10:14:46 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:07:09.884 { 00:07:09.884 "name": "Malloc1", 00:07:09.884 "aliases": [ 00:07:09.884 "6f25a63e-584c-41d4-8229-aabc5944c1d2" 00:07:09.884 ], 00:07:09.884 "product_name": "Malloc disk", 00:07:09.884 "block_size": 4096, 00:07:09.884 "num_blocks": 256, 00:07:09.884 "uuid": "6f25a63e-584c-41d4-8229-aabc5944c1d2", 00:07:09.884 "assigned_rate_limits": { 00:07:09.884 "rw_ios_per_sec": 0, 00:07:09.884 "rw_mbytes_per_sec": 0, 00:07:09.884 "r_mbytes_per_sec": 0, 00:07:09.884 "w_mbytes_per_sec": 0 
00:07:09.884 }, 00:07:09.884 "claimed": false, 00:07:09.884 "zoned": false, 00:07:09.884 "supported_io_types": { 00:07:09.884 "read": true, 00:07:09.884 "write": true, 00:07:09.884 "unmap": true, 00:07:09.884 "flush": true, 00:07:09.884 "reset": true, 00:07:09.884 "nvme_admin": false, 00:07:09.884 "nvme_io": false, 00:07:09.884 "nvme_io_md": false, 00:07:09.884 "write_zeroes": true, 00:07:09.884 "zcopy": true, 00:07:09.884 "get_zone_info": false, 00:07:09.884 "zone_management": false, 00:07:09.884 "zone_append": false, 00:07:09.884 "compare": false, 00:07:09.884 "compare_and_write": false, 00:07:09.884 "abort": true, 00:07:09.884 "seek_hole": false, 00:07:09.884 "seek_data": false, 00:07:09.884 "copy": true, 00:07:09.884 "nvme_iov_md": false 00:07:09.884 }, 00:07:09.884 "memory_domains": [ 00:07:09.884 { 00:07:09.884 "dma_device_id": "system", 00:07:09.884 "dma_device_type": 1 00:07:09.884 }, 00:07:09.884 { 00:07:09.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:09.884 "dma_device_type": 2 00:07:09.884 } 00:07:09.884 ], 00:07:09.884 "driver_specific": {} 00:07:09.884 } 00:07:09.884 ]' 00:07:09.884 10:14:46 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:07:09.884 10:14:47 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:07:09.884 10:14:47 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:07:09.884 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.884 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:09.884 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.884 10:14:47 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:07:09.884 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:09.884 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:09.884 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:09.884 10:14:47 
rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:07:09.884 10:14:47 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:07:10.141 10:14:47 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:07:10.141 00:07:10.141 real 0m0.147s 00:07:10.141 user 0m0.095s 00:07:10.141 sys 0m0.021s 00:07:10.141 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.141 10:14:47 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:10.141 ************************************ 00:07:10.141 END TEST rpc_plugins 00:07:10.141 ************************************ 00:07:10.141 10:14:47 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:10.141 10:14:47 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:07:10.141 10:14:47 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:10.141 10:14:47 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.141 10:14:47 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:10.141 ************************************ 00:07:10.142 START TEST rpc_trace_cmd_test 00:07:10.142 ************************************ 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:07:10.142 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid433683", 00:07:10.142 "tpoint_group_mask": "0x8", 00:07:10.142 "iscsi_conn": { 00:07:10.142 "mask": "0x2", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 
"scsi": { 00:07:10.142 "mask": "0x4", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "bdev": { 00:07:10.142 "mask": "0x8", 00:07:10.142 "tpoint_mask": "0xffffffffffffffff" 00:07:10.142 }, 00:07:10.142 "nvmf_rdma": { 00:07:10.142 "mask": "0x10", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "nvmf_tcp": { 00:07:10.142 "mask": "0x20", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "ftl": { 00:07:10.142 "mask": "0x40", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "blobfs": { 00:07:10.142 "mask": "0x80", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "dsa": { 00:07:10.142 "mask": "0x200", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "thread": { 00:07:10.142 "mask": "0x400", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "nvme_pcie": { 00:07:10.142 "mask": "0x800", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "iaa": { 00:07:10.142 "mask": "0x1000", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "nvme_tcp": { 00:07:10.142 "mask": "0x2000", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "bdev_nvme": { 00:07:10.142 "mask": "0x4000", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 }, 00:07:10.142 "sock": { 00:07:10.142 "mask": "0x8000", 00:07:10.142 "tpoint_mask": "0x0" 00:07:10.142 } 00:07:10.142 }' 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:07:10.142 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:07:10.399 
10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:07:10.399 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:07:10.399 10:14:47 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:07:10.399 00:07:10.399 real 0m0.241s 00:07:10.399 user 0m0.197s 00:07:10.399 sys 0m0.034s 00:07:10.399 10:14:47 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.399 10:14:47 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:10.399 ************************************ 00:07:10.399 END TEST rpc_trace_cmd_test 00:07:10.399 ************************************ 00:07:10.399 10:14:47 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:10.399 10:14:47 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:07:10.399 10:14:47 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:07:10.399 10:14:47 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:07:10.399 10:14:47 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:10.399 10:14:47 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:10.399 10:14:47 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:10.399 ************************************ 00:07:10.399 START TEST rpc_daemon_integrity 00:07:10.399 ************************************ 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 
00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.399 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:10.657 { 00:07:10.657 "name": "Malloc2", 00:07:10.657 "aliases": [ 00:07:10.657 "50f3e352-a5af-4494-ab15-a2886def7b72" 00:07:10.657 ], 00:07:10.657 "product_name": "Malloc disk", 00:07:10.657 "block_size": 512, 00:07:10.657 "num_blocks": 16384, 00:07:10.657 "uuid": "50f3e352-a5af-4494-ab15-a2886def7b72", 00:07:10.657 "assigned_rate_limits": { 00:07:10.657 "rw_ios_per_sec": 0, 00:07:10.657 "rw_mbytes_per_sec": 0, 00:07:10.657 "r_mbytes_per_sec": 0, 00:07:10.657 "w_mbytes_per_sec": 0 00:07:10.657 }, 00:07:10.657 "claimed": false, 00:07:10.657 "zoned": false, 00:07:10.657 "supported_io_types": { 00:07:10.657 "read": true, 00:07:10.657 "write": true, 00:07:10.657 "unmap": true, 00:07:10.657 "flush": true, 00:07:10.657 "reset": true, 00:07:10.657 "nvme_admin": false, 00:07:10.657 "nvme_io": false, 00:07:10.657 "nvme_io_md": false, 00:07:10.657 "write_zeroes": true, 00:07:10.657 "zcopy": true, 00:07:10.657 "get_zone_info": false, 00:07:10.657 "zone_management": 
false, 00:07:10.657 "zone_append": false, 00:07:10.657 "compare": false, 00:07:10.657 "compare_and_write": false, 00:07:10.657 "abort": true, 00:07:10.657 "seek_hole": false, 00:07:10.657 "seek_data": false, 00:07:10.657 "copy": true, 00:07:10.657 "nvme_iov_md": false 00:07:10.657 }, 00:07:10.657 "memory_domains": [ 00:07:10.657 { 00:07:10.657 "dma_device_id": "system", 00:07:10.657 "dma_device_type": 1 00:07:10.657 }, 00:07:10.657 { 00:07:10.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:10.657 "dma_device_type": 2 00:07:10.657 } 00:07:10.657 ], 00:07:10.657 "driver_specific": {} 00:07:10.657 } 00:07:10.657 ]' 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.657 [2024-07-15 10:14:47.648146] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:07:10.657 [2024-07-15 10:14:47.648191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:10.657 [2024-07-15 10:14:47.648216] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2323b20 00:07:10.657 [2024-07-15 10:14:47.648229] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:10.657 [2024-07-15 10:14:47.649630] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:10.657 [2024-07-15 10:14:47.649656] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:10.657 Passthru0 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:10.657 { 00:07:10.657 "name": "Malloc2", 00:07:10.657 "aliases": [ 00:07:10.657 "50f3e352-a5af-4494-ab15-a2886def7b72" 00:07:10.657 ], 00:07:10.657 "product_name": "Malloc disk", 00:07:10.657 "block_size": 512, 00:07:10.657 "num_blocks": 16384, 00:07:10.657 "uuid": "50f3e352-a5af-4494-ab15-a2886def7b72", 00:07:10.657 "assigned_rate_limits": { 00:07:10.657 "rw_ios_per_sec": 0, 00:07:10.657 "rw_mbytes_per_sec": 0, 00:07:10.657 "r_mbytes_per_sec": 0, 00:07:10.657 "w_mbytes_per_sec": 0 00:07:10.657 }, 00:07:10.657 "claimed": true, 00:07:10.657 "claim_type": "exclusive_write", 00:07:10.657 "zoned": false, 00:07:10.657 "supported_io_types": { 00:07:10.657 "read": true, 00:07:10.657 "write": true, 00:07:10.657 "unmap": true, 00:07:10.657 "flush": true, 00:07:10.657 "reset": true, 00:07:10.657 "nvme_admin": false, 00:07:10.657 "nvme_io": false, 00:07:10.657 "nvme_io_md": false, 00:07:10.657 "write_zeroes": true, 00:07:10.657 "zcopy": true, 00:07:10.657 "get_zone_info": false, 00:07:10.657 "zone_management": false, 00:07:10.657 "zone_append": false, 00:07:10.657 "compare": false, 00:07:10.657 "compare_and_write": false, 00:07:10.657 "abort": true, 00:07:10.657 "seek_hole": false, 00:07:10.657 "seek_data": false, 00:07:10.657 "copy": true, 00:07:10.657 "nvme_iov_md": false 00:07:10.657 }, 00:07:10.657 "memory_domains": [ 00:07:10.657 { 00:07:10.657 "dma_device_id": "system", 00:07:10.657 "dma_device_type": 1 00:07:10.657 }, 00:07:10.657 { 00:07:10.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:10.657 "dma_device_type": 2 00:07:10.657 } 00:07:10.657 ], 
00:07:10.657 "driver_specific": {} 00:07:10.657 }, 00:07:10.657 { 00:07:10.657 "name": "Passthru0", 00:07:10.657 "aliases": [ 00:07:10.657 "da2d3a8e-6df7-55fc-9db4-2b6d5a60b057" 00:07:10.657 ], 00:07:10.657 "product_name": "passthru", 00:07:10.657 "block_size": 512, 00:07:10.657 "num_blocks": 16384, 00:07:10.657 "uuid": "da2d3a8e-6df7-55fc-9db4-2b6d5a60b057", 00:07:10.657 "assigned_rate_limits": { 00:07:10.657 "rw_ios_per_sec": 0, 00:07:10.657 "rw_mbytes_per_sec": 0, 00:07:10.657 "r_mbytes_per_sec": 0, 00:07:10.657 "w_mbytes_per_sec": 0 00:07:10.657 }, 00:07:10.657 "claimed": false, 00:07:10.657 "zoned": false, 00:07:10.657 "supported_io_types": { 00:07:10.657 "read": true, 00:07:10.657 "write": true, 00:07:10.657 "unmap": true, 00:07:10.657 "flush": true, 00:07:10.657 "reset": true, 00:07:10.657 "nvme_admin": false, 00:07:10.657 "nvme_io": false, 00:07:10.657 "nvme_io_md": false, 00:07:10.657 "write_zeroes": true, 00:07:10.657 "zcopy": true, 00:07:10.657 "get_zone_info": false, 00:07:10.657 "zone_management": false, 00:07:10.657 "zone_append": false, 00:07:10.657 "compare": false, 00:07:10.657 "compare_and_write": false, 00:07:10.657 "abort": true, 00:07:10.657 "seek_hole": false, 00:07:10.657 "seek_data": false, 00:07:10.657 "copy": true, 00:07:10.657 "nvme_iov_md": false 00:07:10.657 }, 00:07:10.657 "memory_domains": [ 00:07:10.657 { 00:07:10.657 "dma_device_id": "system", 00:07:10.657 "dma_device_type": 1 00:07:10.657 }, 00:07:10.657 { 00:07:10.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:10.657 "dma_device_type": 2 00:07:10.657 } 00:07:10.657 ], 00:07:10.657 "driver_specific": { 00:07:10.657 "passthru": { 00:07:10.657 "name": "Passthru0", 00:07:10.657 "base_bdev_name": "Malloc2" 00:07:10.657 } 00:07:10.657 } 00:07:10.657 } 00:07:10.657 ]' 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.657 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:10.658 00:07:10.658 real 0m0.290s 00:07:10.658 user 0m0.187s 00:07:10.658 sys 0m0.053s 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:10.658 10:14:47 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:10.658 ************************************ 00:07:10.658 END TEST rpc_daemon_integrity 00:07:10.658 ************************************ 00:07:10.658 10:14:47 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:10.658 10:14:47 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:07:10.658 10:14:47 rpc -- rpc/rpc.sh@84 -- # 
killprocess 433683 00:07:10.658 10:14:47 rpc -- common/autotest_common.sh@948 -- # '[' -z 433683 ']' 00:07:10.658 10:14:47 rpc -- common/autotest_common.sh@952 -- # kill -0 433683 00:07:10.658 10:14:47 rpc -- common/autotest_common.sh@953 -- # uname 00:07:10.658 10:14:47 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:10.658 10:14:47 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 433683 00:07:10.915 10:14:47 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:10.915 10:14:47 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:10.915 10:14:47 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 433683' 00:07:10.915 killing process with pid 433683 00:07:10.915 10:14:47 rpc -- common/autotest_common.sh@967 -- # kill 433683 00:07:10.915 10:14:47 rpc -- common/autotest_common.sh@972 -- # wait 433683 00:07:11.173 00:07:11.173 real 0m2.848s 00:07:11.173 user 0m3.586s 00:07:11.173 sys 0m0.943s 00:07:11.173 10:14:48 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:11.173 10:14:48 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:11.173 ************************************ 00:07:11.173 END TEST rpc 00:07:11.173 ************************************ 00:07:11.173 10:14:48 -- common/autotest_common.sh@1142 -- # return 0 00:07:11.173 10:14:48 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:11.173 10:14:48 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:11.173 10:14:48 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.173 10:14:48 -- common/autotest_common.sh@10 -- # set +x 00:07:11.432 ************************************ 00:07:11.432 START TEST skip_rpc 00:07:11.432 ************************************ 00:07:11.432 10:14:48 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:11.432 * Looking 
for test storage... 00:07:11.432 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:11.432 10:14:48 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:11.432 10:14:48 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:11.432 10:14:48 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:07:11.432 10:14:48 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:11.432 10:14:48 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:11.432 10:14:48 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:11.432 ************************************ 00:07:11.432 START TEST skip_rpc 00:07:11.432 ************************************ 00:07:11.432 10:14:48 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:07:11.432 10:14:48 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=434230 00:07:11.432 10:14:48 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:11.432 10:14:48 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:07:11.432 10:14:48 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:07:11.432 [2024-07-15 10:14:48.593533] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:07:11.432 [2024-07-15 10:14:48.593606] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid434230 ] 00:07:11.690 [2024-07-15 10:14:48.724217] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.690 [2024-07-15 10:14:48.823717] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 434230 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 434230 ']' 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 434230 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 434230 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 434230' 00:07:16.972 killing process with pid 434230 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 434230 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 434230 00:07:16.972 00:07:16.972 real 0m5.456s 00:07:16.972 user 0m5.093s 00:07:16.972 sys 0m0.384s 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:16.972 10:14:53 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:16.972 ************************************ 00:07:16.972 END TEST skip_rpc 00:07:16.972 ************************************ 00:07:16.972 10:14:54 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:16.973 10:14:54 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:16.973 10:14:54 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:16.973 10:14:54 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:16.973 10:14:54 skip_rpc -- common/autotest_common.sh@10 -- # set 
+x 00:07:16.973 ************************************ 00:07:16.973 START TEST skip_rpc_with_json 00:07:16.973 ************************************ 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=434957 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 434957 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 434957 ']' 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:16.973 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:16.973 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:16.973 [2024-07-15 10:14:54.126860] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:07:16.973 [2024-07-15 10:14:54.126935] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid434957 ] 00:07:17.231 [2024-07-15 10:14:54.255352] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.231 [2024-07-15 10:14:54.361751] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:17.797 [2024-07-15 10:14:54.967598] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:17.797 request: 00:07:17.797 { 00:07:17.797 "trtype": "tcp", 00:07:17.797 "method": "nvmf_get_transports", 00:07:17.797 "req_id": 1 00:07:17.797 } 00:07:17.797 Got JSON-RPC error response 00:07:17.797 response: 00:07:17.797 { 00:07:17.797 "code": -19, 00:07:17.797 "message": "No such device" 00:07:17.797 } 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:17.797 [2024-07-15 10:14:54.975728] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:17.797 10:14:54 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:17.797 10:14:54 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:18.055 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:18.055 10:14:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:18.055 { 00:07:18.055 "subsystems": [ 00:07:18.055 { 00:07:18.055 "subsystem": "keyring", 00:07:18.055 "config": [] 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "subsystem": "iobuf", 00:07:18.055 "config": [ 00:07:18.055 { 00:07:18.055 "method": "iobuf_set_options", 00:07:18.055 "params": { 00:07:18.055 "small_pool_count": 8192, 00:07:18.055 "large_pool_count": 1024, 00:07:18.055 "small_bufsize": 8192, 00:07:18.055 "large_bufsize": 135168 00:07:18.055 } 00:07:18.055 } 00:07:18.055 ] 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "subsystem": "sock", 00:07:18.055 "config": [ 00:07:18.055 { 00:07:18.055 "method": "sock_set_default_impl", 00:07:18.055 "params": { 00:07:18.055 "impl_name": "posix" 00:07:18.055 } 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "method": "sock_impl_set_options", 00:07:18.055 "params": { 00:07:18.055 "impl_name": "ssl", 00:07:18.055 "recv_buf_size": 4096, 00:07:18.055 "send_buf_size": 4096, 00:07:18.055 "enable_recv_pipe": true, 00:07:18.055 "enable_quickack": false, 00:07:18.055 "enable_placement_id": 0, 00:07:18.055 "enable_zerocopy_send_server": true, 00:07:18.055 "enable_zerocopy_send_client": false, 00:07:18.055 "zerocopy_threshold": 0, 00:07:18.055 "tls_version": 0, 00:07:18.055 "enable_ktls": false 00:07:18.055 } 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "method": "sock_impl_set_options", 00:07:18.055 "params": { 
00:07:18.055 "impl_name": "posix", 00:07:18.055 "recv_buf_size": 2097152, 00:07:18.055 "send_buf_size": 2097152, 00:07:18.055 "enable_recv_pipe": true, 00:07:18.055 "enable_quickack": false, 00:07:18.055 "enable_placement_id": 0, 00:07:18.055 "enable_zerocopy_send_server": true, 00:07:18.055 "enable_zerocopy_send_client": false, 00:07:18.055 "zerocopy_threshold": 0, 00:07:18.055 "tls_version": 0, 00:07:18.055 "enable_ktls": false 00:07:18.055 } 00:07:18.055 } 00:07:18.055 ] 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "subsystem": "vmd", 00:07:18.055 "config": [] 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "subsystem": "accel", 00:07:18.055 "config": [ 00:07:18.055 { 00:07:18.055 "method": "accel_set_options", 00:07:18.055 "params": { 00:07:18.055 "small_cache_size": 128, 00:07:18.055 "large_cache_size": 16, 00:07:18.055 "task_count": 2048, 00:07:18.055 "sequence_count": 2048, 00:07:18.055 "buf_count": 2048 00:07:18.055 } 00:07:18.055 } 00:07:18.055 ] 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "subsystem": "bdev", 00:07:18.055 "config": [ 00:07:18.055 { 00:07:18.055 "method": "bdev_set_options", 00:07:18.055 "params": { 00:07:18.055 "bdev_io_pool_size": 65535, 00:07:18.055 "bdev_io_cache_size": 256, 00:07:18.055 "bdev_auto_examine": true, 00:07:18.055 "iobuf_small_cache_size": 128, 00:07:18.055 "iobuf_large_cache_size": 16 00:07:18.055 } 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "method": "bdev_raid_set_options", 00:07:18.055 "params": { 00:07:18.055 "process_window_size_kb": 1024 00:07:18.055 } 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "method": "bdev_iscsi_set_options", 00:07:18.055 "params": { 00:07:18.055 "timeout_sec": 30 00:07:18.055 } 00:07:18.055 }, 00:07:18.055 { 00:07:18.055 "method": "bdev_nvme_set_options", 00:07:18.055 "params": { 00:07:18.055 "action_on_timeout": "none", 00:07:18.055 "timeout_us": 0, 00:07:18.055 "timeout_admin_us": 0, 00:07:18.055 "keep_alive_timeout_ms": 10000, 00:07:18.055 "arbitration_burst": 0, 00:07:18.055 
"low_priority_weight": 0, 00:07:18.055 "medium_priority_weight": 0, 00:07:18.055 "high_priority_weight": 0, 00:07:18.055 "nvme_adminq_poll_period_us": 10000, 00:07:18.055 "nvme_ioq_poll_period_us": 0, 00:07:18.055 "io_queue_requests": 0, 00:07:18.055 "delay_cmd_submit": true, 00:07:18.055 "transport_retry_count": 4, 00:07:18.055 "bdev_retry_count": 3, 00:07:18.055 "transport_ack_timeout": 0, 00:07:18.055 "ctrlr_loss_timeout_sec": 0, 00:07:18.056 "reconnect_delay_sec": 0, 00:07:18.056 "fast_io_fail_timeout_sec": 0, 00:07:18.056 "disable_auto_failback": false, 00:07:18.056 "generate_uuids": false, 00:07:18.056 "transport_tos": 0, 00:07:18.056 "nvme_error_stat": false, 00:07:18.056 "rdma_srq_size": 0, 00:07:18.056 "io_path_stat": false, 00:07:18.056 "allow_accel_sequence": false, 00:07:18.056 "rdma_max_cq_size": 0, 00:07:18.056 "rdma_cm_event_timeout_ms": 0, 00:07:18.056 "dhchap_digests": [ 00:07:18.056 "sha256", 00:07:18.056 "sha384", 00:07:18.056 "sha512" 00:07:18.056 ], 00:07:18.056 "dhchap_dhgroups": [ 00:07:18.056 "null", 00:07:18.056 "ffdhe2048", 00:07:18.056 "ffdhe3072", 00:07:18.056 "ffdhe4096", 00:07:18.056 "ffdhe6144", 00:07:18.056 "ffdhe8192" 00:07:18.056 ] 00:07:18.056 } 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "method": "bdev_nvme_set_hotplug", 00:07:18.056 "params": { 00:07:18.056 "period_us": 100000, 00:07:18.056 "enable": false 00:07:18.056 } 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "method": "bdev_wait_for_examine" 00:07:18.056 } 00:07:18.056 ] 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": "scsi", 00:07:18.056 "config": null 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": "scheduler", 00:07:18.056 "config": [ 00:07:18.056 { 00:07:18.056 "method": "framework_set_scheduler", 00:07:18.056 "params": { 00:07:18.056 "name": "static" 00:07:18.056 } 00:07:18.056 } 00:07:18.056 ] 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": "vhost_scsi", 00:07:18.056 "config": [] 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": 
"vhost_blk", 00:07:18.056 "config": [] 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": "ublk", 00:07:18.056 "config": [] 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": "nbd", 00:07:18.056 "config": [] 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": "nvmf", 00:07:18.056 "config": [ 00:07:18.056 { 00:07:18.056 "method": "nvmf_set_config", 00:07:18.056 "params": { 00:07:18.056 "discovery_filter": "match_any", 00:07:18.056 "admin_cmd_passthru": { 00:07:18.056 "identify_ctrlr": false 00:07:18.056 } 00:07:18.056 } 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "method": "nvmf_set_max_subsystems", 00:07:18.056 "params": { 00:07:18.056 "max_subsystems": 1024 00:07:18.056 } 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "method": "nvmf_set_crdt", 00:07:18.056 "params": { 00:07:18.056 "crdt1": 0, 00:07:18.056 "crdt2": 0, 00:07:18.056 "crdt3": 0 00:07:18.056 } 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "method": "nvmf_create_transport", 00:07:18.056 "params": { 00:07:18.056 "trtype": "TCP", 00:07:18.056 "max_queue_depth": 128, 00:07:18.056 "max_io_qpairs_per_ctrlr": 127, 00:07:18.056 "in_capsule_data_size": 4096, 00:07:18.056 "max_io_size": 131072, 00:07:18.056 "io_unit_size": 131072, 00:07:18.056 "max_aq_depth": 128, 00:07:18.056 "num_shared_buffers": 511, 00:07:18.056 "buf_cache_size": 4294967295, 00:07:18.056 "dif_insert_or_strip": false, 00:07:18.056 "zcopy": false, 00:07:18.056 "c2h_success": true, 00:07:18.056 "sock_priority": 0, 00:07:18.056 "abort_timeout_sec": 1, 00:07:18.056 "ack_timeout": 0, 00:07:18.056 "data_wr_pool_size": 0 00:07:18.056 } 00:07:18.056 } 00:07:18.056 ] 00:07:18.056 }, 00:07:18.056 { 00:07:18.056 "subsystem": "iscsi", 00:07:18.056 "config": [ 00:07:18.056 { 00:07:18.056 "method": "iscsi_set_options", 00:07:18.056 "params": { 00:07:18.056 "node_base": "iqn.2016-06.io.spdk", 00:07:18.056 "max_sessions": 128, 00:07:18.056 "max_connections_per_session": 2, 00:07:18.056 "max_queue_depth": 64, 00:07:18.056 "default_time2wait": 2, 
00:07:18.056 "default_time2retain": 20, 00:07:18.056 "first_burst_length": 8192, 00:07:18.056 "immediate_data": true, 00:07:18.056 "allow_duplicated_isid": false, 00:07:18.056 "error_recovery_level": 0, 00:07:18.056 "nop_timeout": 60, 00:07:18.056 "nop_in_interval": 30, 00:07:18.056 "disable_chap": false, 00:07:18.056 "require_chap": false, 00:07:18.056 "mutual_chap": false, 00:07:18.056 "chap_group": 0, 00:07:18.056 "max_large_datain_per_connection": 64, 00:07:18.056 "max_r2t_per_connection": 4, 00:07:18.056 "pdu_pool_size": 36864, 00:07:18.056 "immediate_data_pool_size": 16384, 00:07:18.056 "data_out_pool_size": 2048 00:07:18.056 } 00:07:18.056 } 00:07:18.056 ] 00:07:18.056 } 00:07:18.056 ] 00:07:18.056 } 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 434957 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 434957 ']' 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 434957 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 434957 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 434957' 00:07:18.056 killing process with pid 434957 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 434957 00:07:18.056 10:14:55 skip_rpc.skip_rpc_with_json -- 
common/autotest_common.sh@972 -- # wait 434957 00:07:18.665 10:14:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=435145 00:07:18.665 10:14:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:07:18.665 10:14:55 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 435145 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 435145 ']' 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 435145 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 435145 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 435145' 00:07:23.933 killing process with pid 435145 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 435145 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 435145 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:23.933 00:07:23.933 real 0m6.938s 00:07:23.933 user 0m6.547s 00:07:23.933 sys 0m0.843s 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:23.933 10:15:00 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:23.933 ************************************ 00:07:23.933 END TEST skip_rpc_with_json 00:07:23.933 ************************************ 00:07:23.933 10:15:01 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:23.933 10:15:01 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:23.933 10:15:01 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:23.933 10:15:01 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:23.933 10:15:01 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:23.933 ************************************ 00:07:23.933 START TEST skip_rpc_with_delay 00:07:23.933 ************************************ 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:23.933 10:15:01 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:23.933 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:23.933 [2024-07-15 10:15:01.126791] app.c: 831:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:07:23.933 [2024-07-15 10:15:01.126859] app.c: 710:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:07:24.191 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:07:24.191 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:24.191 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:24.191 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:24.191 00:07:24.191 real 0m0.061s 00:07:24.191 user 0m0.034s 00:07:24.191 sys 0m0.026s 00:07:24.191 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:24.191 10:15:01 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:24.191 ************************************ 00:07:24.191 END TEST skip_rpc_with_delay 00:07:24.191 ************************************ 00:07:24.191 10:15:01 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:24.191 10:15:01 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:24.191 10:15:01 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:24.191 10:15:01 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:24.191 10:15:01 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:24.191 10:15:01 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:24.191 10:15:01 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:24.191 ************************************ 00:07:24.191 START TEST exit_on_failed_rpc_init 00:07:24.191 ************************************ 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=435979 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 435979 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 435979 ']' 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.191 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:24.191 10:15:01 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:24.191 [2024-07-15 10:15:01.293987] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:07:24.191 [2024-07-15 10:15:01.294059] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid435979 ] 00:07:24.450 [2024-07-15 10:15:01.423272] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.450 [2024-07-15 10:15:01.523764] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:25.015 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:25.016 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:25.016 [2024-07-15 10:15:02.199346] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:07:25.016 [2024-07-15 10:15:02.199413] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid436208 ] 00:07:25.273 [2024-07-15 10:15:02.317391] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.273 [2024-07-15 10:15:02.417443] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.273 [2024-07-15 10:15:02.417527] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:25.273 [2024-07-15 10:15:02.417544] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock
00:07:25.273 [2024-07-15 10:15:02.417556] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 435979
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 435979 ']'
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 435979
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 435979
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 435979'
00:07:25.532 killing process with pid 435979
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 435979
00:07:25.532 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 435979
00:07:25.790
00:07:25.790 real 0m1.750s
00:07:25.790 user 0m1.964s
00:07:25.790 sys 0m0.595s
00:07:25.790 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:25.790 10:15:02 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x
00:07:25.790 ************************************
00:07:25.790 END TEST exit_on_failed_rpc_init
00:07:25.790 ************************************
00:07:26.048 10:15:03 skip_rpc -- common/autotest_common.sh@1142 -- # return 0
00:07:26.048 10:15:03 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json
00:07:26.048
00:07:26.048 real 0m14.649s
00:07:26.048 user 0m13.813s
00:07:26.048 sys 0m2.152s
00:07:26.048 10:15:03 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:26.048 10:15:03 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:26.048 ************************************
00:07:26.048 END TEST skip_rpc
00:07:26.048 ************************************
00:07:26.048 10:15:03 -- common/autotest_common.sh@1142 -- # return 0
00:07:26.048 10:15:03 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:07:26.048 10:15:03 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:26.048 10:15:03 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:26.048 10:15:03 -- common/autotest_common.sh@10 -- # set +x
00:07:26.048 ************************************
00:07:26.048 START TEST rpc_client
00:07:26.048 ************************************
00:07:26.049 10:15:03 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:07:26.049 * Looking for test storage...
00:07:26.049 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client
00:07:26.049 10:15:03 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test
00:07:26.049 OK
00:07:26.049 10:15:03 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT
00:07:26.049
00:07:26.049 real 0m0.138s
00:07:26.049 user 0m0.054s
00:07:26.049 sys 0m0.093s
00:07:26.049 10:15:03 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable
00:07:26.049 10:15:03 rpc_client -- common/autotest_common.sh@10 -- # set +x
00:07:26.049 ************************************
00:07:26.049 END TEST rpc_client
00:07:26.049 ************************************
00:07:26.307 10:15:03 -- common/autotest_common.sh@1142 -- # return 0
00:07:26.307 10:15:03 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh
00:07:26.307 10:15:03 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:07:26.307 10:15:03 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:07:26.307 10:15:03 -- common/autotest_common.sh@10 -- # set +x
00:07:26.307 ************************************
00:07:26.307 START TEST json_config
00:07:26.307 ************************************
00:07:26.307 10:15:03 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh
00:07:26.307 10:15:03 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@7 -- # uname -s
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:07:26.307 10:15:03 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:07:26.307 10:15:03 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:07:26.307 10:15:03 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:07:26.307 10:15:03 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:26.307 10:15:03 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:26.307 10:15:03 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:26.307 10:15:03 json_config -- paths/export.sh@5 -- # export PATH
00:07:26.307 10:15:03 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@47 -- # : 0
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:07:26.307 10:15:03 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0
00:07:26.307 10:15:03 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh
00:07:26.307 10:15:03 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]]
00:07:26.307 10:15:03 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]]
00:07:26.307 10:15:03 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]]
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 ))
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='')
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock')
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024')
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@33 -- # declare -A app_params
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json')
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@40 -- # last_event_id=0
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init'
00:07:26.308 INFO: JSON configuration test init
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@357 -- # json_config_test_init
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@722 -- # xtrace_disable
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@722 -- # xtrace_disable
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:26.308 10:15:03 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc
00:07:26.308 10:15:03 json_config -- json_config/common.sh@9 -- # local app=target
00:07:26.308 10:15:03 json_config -- json_config/common.sh@10 -- # shift
00:07:26.308 10:15:03 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]]
00:07:26.308 10:15:03 json_config -- json_config/common.sh@13 -- # [[ -z '' ]]
00:07:26.308 10:15:03 json_config -- json_config/common.sh@15 -- # local app_extra_params=
00:07:26.308 10:15:03 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:07:26.308 10:15:03 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:07:26.308 10:15:03 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=436490
00:07:26.308 10:15:03 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...'
00:07:26.308 Waiting for target to run...
00:07:26.308 10:15:03 json_config -- json_config/common.sh@25 -- # waitforlisten 436490 /var/tmp/spdk_tgt.sock
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@829 -- # '[' -z 436490 ']'
00:07:26.308 10:15:03 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@834 -- # local max_retries=100
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...'
00:07:26.308 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@838 -- # xtrace_disable
00:07:26.308 10:15:03 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:26.566 [2024-07-15 10:15:03.514044] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:07:26.566 [2024-07-15 10:15:03.514117] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid436490 ]
00:07:27.133 [2024-07-15 10:15:04.117994] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:27.133 [2024-07-15 10:15:04.227450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:27.390 10:15:04 json_config -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:27.390 10:15:04 json_config -- common/autotest_common.sh@862 -- # return 0
00:07:27.390 10:15:04 json_config -- json_config/common.sh@26 -- # echo ''
00:07:27.390
00:07:27.390 10:15:04 json_config -- json_config/json_config.sh@269 -- # create_accel_config
00:07:27.390 10:15:04 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config
00:07:27.390 10:15:04 json_config -- common/autotest_common.sh@722 -- # xtrace_disable
00:07:27.390 10:15:04 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:27.390 10:15:04 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]]
00:07:27.390 10:15:04 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module
00:07:27.390 10:15:04 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module
00:07:27.647 10:15:04 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev
00:07:27.647 10:15:04 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev
00:07:27.903 [2024-07-15 10:15:04.901614] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:07:27.903 10:15:04 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev
00:07:27.903 10:15:04 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev
00:07:28.159 [2024-07-15 10:15:05.142220] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:07:28.159 10:15:05 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config
00:07:28.159 10:15:05 json_config -- common/autotest_common.sh@728 -- # xtrace_disable
00:07:28.159 10:15:05 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:28.159 10:15:05 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems
00:07:28.159 10:15:05 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config
00:07:28.159 10:15:05 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config
00:07:28.415 [2024-07-15 10:15:05.391545] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:07:30.938 10:15:07 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types
00:07:30.938 10:15:07 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types
00:07:30.938 10:15:07 json_config -- common/autotest_common.sh@722 -- # xtrace_disable
00:07:30.938 10:15:07 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:30.938 10:15:07 json_config -- json_config/json_config.sh@45 -- # local ret=0
00:07:30.938 10:15:07 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister')
00:07:30.938 10:15:07 json_config -- json_config/json_config.sh@46 -- # local enabled_types
00:07:30.938 10:15:07 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types
00:07:30.938 10:15:07 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types
00:07:30.938 10:15:07 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]'
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister')
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@48 -- # local get_types
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]]
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types
00:07:31.195 10:15:08 json_config -- common/autotest_common.sh@728 -- # xtrace_disable
00:07:31.195 10:15:08 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@55 -- # return 0
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]]
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config
00:07:31.195 10:15:08 json_config -- common/autotest_common.sh@722 -- # xtrace_disable
00:07:31.195 10:15:08 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@107 -- # expected_notifications=()
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@107 -- # local expected_notifications
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications))
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@111 -- # get_notifications
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0
00:07:31.195 10:15:08 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"'
00:07:31.195 10:15:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0
00:07:31.453 10:15:08 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1
00:07:31.453 10:15:08 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:31.453 10:15:08 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:31.453 10:15:08 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]]
00:07:31.453 10:15:08 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1
00:07:31.453 10:15:08 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2
00:07:31.453 10:15:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2
00:07:31.710 Nvme0n1p0 Nvme0n1p1
00:07:31.710 10:15:08 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3
00:07:31.710 10:15:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3
00:07:31.967 [2024-07-15 10:15:08.930649] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
[2024-07-15 10:15:08.930709] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:07:31.967 10:15:08 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3
00:07:31.967 10:15:08 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3
00:07:32.225 Malloc3
00:07:32.225 10:15:09 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3
00:07:32.225 10:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3
00:07:32.225 [2024-07-15 10:15:09.412033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:07:32.225 [2024-07-15 10:15:09.412085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:07:32.225 [2024-07-15 10:15:09.412112] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eaaa00
00:07:32.225 [2024-07-15 10:15:09.412125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:07:32.225 [2024-07-15 10:15:09.413742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:07:32.225 [2024-07-15 10:15:09.413772] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3
PTBdevFromMalloc3
00:07:32.482 10:15:09 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512
00:07:32.482 10:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512
Null0
00:07:32.482 10:15:09 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0
00:07:32.482 10:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0
00:07:32.739 Malloc0
00:07:32.739 10:15:09 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1
00:07:32.739 10:15:09 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1
00:07:32.996 Malloc1
00:07:32.996 10:15:10 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1)
00:07:32.996 10:15:10 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400
00:07:33.560 102400+0 records in
00:07:33.560 102400+0 records out
00:07:33.560 104857600 bytes (105 MB, 100 MiB) copied, 0.309398 s, 339 MB/s
00:07:33.560 10:15:10 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024
00:07:33.560 10:15:10 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024
aio_disk
00:07:33.560 10:15:10 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk)
00:07:33.560 10:15:10 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test
00:07:33.560 10:15:10 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test
00:07:38.811 cb158200-9412-41da-aa55-7c4afeb1768b
00:07:38.811 10:15:15 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)")
00:07:38.811 10:15:15 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32
00:07:38.811 10:15:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32
00:07:38.811 10:15:15 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32
00:07:38.811 10:15:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32
00:07:38.811 10:15:15 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0
00:07:38.811 10:15:15 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0
00:07:39.068 10:15:16 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0
00:07:39.068 10:15:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0
00:07:39.326 10:15:16 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]]
00:07:39.326 10:15:16 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev
00:07:39.326 10:15:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev
MallocForCryptoBdev
00:07:39.583 10:15:16 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8
00:07:39.583 10:15:16 json_config -- json_config/json_config.sh@159 -- # wc -l
00:07:39.583 10:15:16 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]]
00:07:39.583 10:15:16 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat
00:07:39.583 10:15:16 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456
00:07:39.583 10:15:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456
00:07:39.840 [2024-07-15 10:15:16.890967] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored
CryptoMallocBdev
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev)
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]]
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:efc8e5ac-55e0-4426-bd11-2f380ebb04c0 bdev_register:edcf67ec-70aa-4f7f-978f-a4d52089a12e bdev_register:308e2e43-1287-4092-b600-d4f659d535ae bdev_register:16b55cdd-1608-42d8-b001-4decde872082 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@67 -- # local events_to_check
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@68 -- # local recorded_events
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort))
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:efc8e5ac-55e0-4426-bd11-2f380ebb04c0 bdev_register:edcf67ec-70aa-4f7f-978f-a4d52089a12e bdev_register:308e2e43-1287-4092-b600-d4f659d535ae bdev_register:16b55cdd-1608-42d8-b001-4decde872082 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@71 -- # sort
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort))
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@72 -- # get_notifications
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@72 -- # sort
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0
00:07:39.840 10:15:16 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"'
00:07:39.840 10:15:16 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=:
00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:efc8e5ac-55e0-4426-bd11-2f380ebb04c0 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:edcf67ec-70aa-4f7f-978f-a4d52089a12e 00:07:40.098 10:15:17 json_config -- 
json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:308e2e43-1287-4092-b600-d4f659d535ae 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:16b55cdd-1608-42d8-b001-4decde872082 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:16b55cdd-1608-42d8-b001-4decde872082 bdev_register:308e2e43-1287-4092-b600-d4f659d535ae bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:edcf67ec-70aa-4f7f-978f-a4d52089a12e bdev_register:efc8e5ac-55e0-4426-bd11-2f380ebb04c0 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\1\6\b\5\5\c\d\d\-\1\6\0\8\-\4\2\d\8\-\b\0\0\1\-\4\d\e\c\d\e\8\7\2\0\8\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\0\8\e\2\e\4\3\-\1\2\8\7\-\4\0\9\2\-\b\6\0\0\-\d\4\f\6\5\9\d\5\3\5\a\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\e\d\c\f\6\7\e\c\-\7\0\a\a\-\4\f\7\f\-\9\7\8\f\-\a\4\d\5\2\0\8\9\a\1\2\e\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\e\f\c\8\e\5\a\c\-\5\5\e\0\-\4\4\2\6\-\b\d\1\1\-\2\f\3\8\0\e\b\b\0\4\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@86 -- # cat 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:16b55cdd-1608-42d8-b001-4decde872082 bdev_register:308e2e43-1287-4092-b600-d4f659d535ae bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:edcf67ec-70aa-4f7f-978f-a4d52089a12e bdev_register:efc8e5ac-55e0-4426-bd11-2f380ebb04c0 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:07:40.098 Expected events matched: 00:07:40.098 bdev_register:16b55cdd-1608-42d8-b001-4decde872082 00:07:40.098 bdev_register:308e2e43-1287-4092-b600-d4f659d535ae 00:07:40.098 
bdev_register:aio_disk 00:07:40.098 bdev_register:CryptoMallocBdev 00:07:40.098 bdev_register:edcf67ec-70aa-4f7f-978f-a4d52089a12e 00:07:40.098 bdev_register:efc8e5ac-55e0-4426-bd11-2f380ebb04c0 00:07:40.098 bdev_register:Malloc0 00:07:40.098 bdev_register:Malloc0p0 00:07:40.098 bdev_register:Malloc0p1 00:07:40.098 bdev_register:Malloc0p2 00:07:40.098 bdev_register:Malloc1 00:07:40.098 bdev_register:Malloc3 00:07:40.098 bdev_register:MallocForCryptoBdev 00:07:40.098 bdev_register:Null0 00:07:40.098 bdev_register:Nvme0n1 00:07:40.098 bdev_register:Nvme0n1p0 00:07:40.098 bdev_register:Nvme0n1p1 00:07:40.098 bdev_register:PTBdevFromMalloc3 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:07:40.098 10:15:17 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:40.098 10:15:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:07:40.098 10:15:17 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:40.098 10:15:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:07:40.098 10:15:17 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:40.098 10:15:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:40.355 MallocBdevForConfigChangeCheck 00:07:40.355 10:15:17 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:07:40.355 10:15:17 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:40.355 10:15:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:40.612 10:15:17 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:07:40.613 10:15:17 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:40.869 10:15:17 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:07:40.869 INFO: shutting down applications... 00:07:40.869 10:15:17 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:07:40.869 10:15:17 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:07:40.870 10:15:17 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:07:40.870 10:15:17 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:07:41.127 [2024-07-15 10:15:18.130816] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:07:44.438 Calling clear_iscsi_subsystem 00:07:44.438 Calling clear_nvmf_subsystem 00:07:44.438 Calling clear_nbd_subsystem 00:07:44.438 Calling clear_ublk_subsystem 00:07:44.438 Calling clear_vhost_blk_subsystem 00:07:44.438 Calling clear_vhost_scsi_subsystem 00:07:44.438 Calling clear_bdev_subsystem 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@343 -- # count=100 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@345 -- # break 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:07:44.438 10:15:21 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:07:44.438 10:15:21 json_config -- json_config/common.sh@31 -- # local app=target 00:07:44.438 10:15:21 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:44.438 10:15:21 json_config -- json_config/common.sh@35 -- # [[ -n 436490 ]] 00:07:44.438 10:15:21 json_config -- json_config/common.sh@38 -- # kill -SIGINT 436490 00:07:44.438 10:15:21 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:44.438 10:15:21 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:44.438 10:15:21 json_config -- json_config/common.sh@41 -- # kill -0 436490 00:07:44.438 10:15:21 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:07:45.005 10:15:21 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:07:45.005 10:15:21 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:45.005 10:15:21 json_config -- json_config/common.sh@41 -- # kill -0 436490 00:07:45.005 10:15:21 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:45.005 10:15:21 json_config -- json_config/common.sh@43 -- # break 00:07:45.005 10:15:21 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:45.005 10:15:21 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:45.005 SPDK target 
shutdown done 00:07:45.005 10:15:21 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:07:45.005 INFO: relaunching applications... 00:07:45.005 10:15:21 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:45.005 10:15:21 json_config -- json_config/common.sh@9 -- # local app=target 00:07:45.005 10:15:21 json_config -- json_config/common.sh@10 -- # shift 00:07:45.005 10:15:21 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:45.005 10:15:21 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:45.005 10:15:21 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:45.005 10:15:21 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:45.005 10:15:21 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:45.005 10:15:21 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=439420 00:07:45.005 10:15:21 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:45.005 Waiting for target to run... 00:07:45.006 10:15:21 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:45.006 10:15:21 json_config -- json_config/common.sh@25 -- # waitforlisten 439420 /var/tmp/spdk_tgt.sock 00:07:45.006 10:15:21 json_config -- common/autotest_common.sh@829 -- # '[' -z 439420 ']' 00:07:45.006 10:15:21 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:45.006 10:15:21 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:45.006 10:15:21 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:07:45.006 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:45.006 10:15:21 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:45.006 10:15:21 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:45.006 [2024-07-15 10:15:22.045966] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:07:45.006 [2024-07-15 10:15:22.046043] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid439420 ] 00:07:45.572 [2024-07-15 10:15:22.670741] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.830 [2024-07-15 10:15:22.782112] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.830 [2024-07-15 10:15:22.836241] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:07:45.830 [2024-07-15 10:15:22.844278] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:07:45.830 [2024-07-15 10:15:22.852296] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:45.830 [2024-07-15 10:15:22.933507] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:48.354 [2024-07-15 10:15:25.144803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:48.355 [2024-07-15 10:15:25.144876] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:48.355 [2024-07-15 10:15:25.144891] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:48.355 [2024-07-15 10:15:25.152842] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Nvme0n1 00:07:48.355 [2024-07-15 10:15:25.152870] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:48.355 [2024-07-15 10:15:25.160837] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:48.355 [2024-07-15 10:15:25.160862] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:48.355 [2024-07-15 10:15:25.168872] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:07:48.355 [2024-07-15 10:15:25.168899] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:07:48.355 [2024-07-15 10:15:25.168912] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:48.355 [2024-07-15 10:15:25.545348] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:48.355 [2024-07-15 10:15:25.545393] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:48.355 [2024-07-15 10:15:25.545410] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29ffb90 00:07:48.355 [2024-07-15 10:15:25.545423] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:48.355 [2024-07-15 10:15:25.545705] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:48.355 [2024-07-15 10:15:25.545725] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:48.612 10:15:25 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:48.612 10:15:25 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:48.612 10:15:25 json_config -- json_config/common.sh@26 -- # echo '' 00:07:48.612 00:07:48.612 10:15:25 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:07:48.612 10:15:25 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target 
configuration is the same...' 00:07:48.612 INFO: Checking if target configuration is the same... 00:07:48.612 10:15:25 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:48.612 10:15:25 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:07:48.612 10:15:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:48.612 + '[' 2 -ne 2 ']' 00:07:48.613 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:48.613 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:48.613 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:48.613 +++ basename /dev/fd/62 00:07:48.613 ++ mktemp /tmp/62.XXX 00:07:48.613 + tmp_file_1=/tmp/62.BHv 00:07:48.613 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:48.613 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:48.613 + tmp_file_2=/tmp/spdk_tgt_config.json.xMz 00:07:48.613 + ret=0 00:07:48.613 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:48.870 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:49.127 + diff -u /tmp/62.BHv /tmp/spdk_tgt_config.json.xMz 00:07:49.127 + echo 'INFO: JSON config files are the same' 00:07:49.127 INFO: JSON config files are the same 00:07:49.127 + rm /tmp/62.BHv /tmp/spdk_tgt_config.json.xMz 00:07:49.127 + exit 0 00:07:49.127 10:15:26 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:07:49.127 10:15:26 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 
00:07:49.127 INFO: changing configuration and checking if this can be detected... 00:07:49.127 10:15:26 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:49.127 10:15:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:49.385 10:15:26 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:49.385 10:15:26 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:07:49.385 10:15:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:49.385 + '[' 2 -ne 2 ']' 00:07:49.385 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:49.385 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:07:49.385 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:49.385 +++ basename /dev/fd/62 00:07:49.385 ++ mktemp /tmp/62.XXX 00:07:49.385 + tmp_file_1=/tmp/62.T5T 00:07:49.385 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:49.385 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:49.385 + tmp_file_2=/tmp/spdk_tgt_config.json.1UI 00:07:49.385 + ret=0 00:07:49.385 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:49.643 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:49.643 + diff -u /tmp/62.T5T /tmp/spdk_tgt_config.json.1UI 00:07:49.643 + ret=1 00:07:49.643 + echo '=== Start of file: /tmp/62.T5T ===' 00:07:49.643 + cat /tmp/62.T5T 00:07:49.643 + echo '=== End of file: /tmp/62.T5T ===' 00:07:49.643 + echo '' 00:07:49.643 + echo '=== Start of file: /tmp/spdk_tgt_config.json.1UI ===' 00:07:49.643 + cat /tmp/spdk_tgt_config.json.1UI 00:07:49.643 + echo '=== End of file: /tmp/spdk_tgt_config.json.1UI ===' 00:07:49.643 + echo '' 00:07:49.643 + rm /tmp/62.T5T /tmp/spdk_tgt_config.json.1UI 00:07:49.643 + exit 1 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:07:49.643 INFO: configuration change detected. 
00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:07:49.643 10:15:26 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:49.643 10:15:26 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@317 -- # [[ -n 439420 ]] 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:07:49.643 10:15:26 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:49.643 10:15:26 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:07:49.643 10:15:26 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:49.643 10:15:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:49.901 10:15:27 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:49.901 10:15:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:50.160 10:15:27 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:50.160 10:15:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:07:50.418 10:15:27 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:50.418 10:15:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:50.676 10:15:27 json_config -- json_config/json_config.sh@193 -- # uname -s 00:07:50.676 10:15:27 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:07:50.676 10:15:27 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:07:50.676 10:15:27 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:07:50.676 10:15:27 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:50.676 10:15:27 json_config -- json_config/json_config.sh@323 -- # killprocess 439420 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@948 -- # '[' -z 439420 ']' 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@952 -- # kill -0 439420 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@953 -- # uname 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 439420 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 439420' 00:07:50.676 killing process with pid 439420 00:07:50.676 10:15:27 json_config -- common/autotest_common.sh@967 -- # kill 439420 00:07:50.676 10:15:27 json_config -- 
common/autotest_common.sh@972 -- # wait 439420 00:07:53.953 10:15:31 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:53.953 10:15:31 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:07:53.953 10:15:31 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:53.953 10:15:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:53.953 10:15:31 json_config -- json_config/json_config.sh@328 -- # return 0 00:07:53.953 10:15:31 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:07:53.953 INFO: Success 00:07:53.953 00:07:53.953 real 0m27.804s 00:07:53.953 user 0m33.286s 00:07:53.953 sys 0m4.122s 00:07:53.953 10:15:31 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:53.953 10:15:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:53.953 ************************************ 00:07:53.953 END TEST json_config 00:07:53.953 ************************************ 00:07:54.211 10:15:31 -- common/autotest_common.sh@1142 -- # return 0 00:07:54.211 10:15:31 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:54.211 10:15:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:54.211 10:15:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.211 10:15:31 -- common/autotest_common.sh@10 -- # set +x 00:07:54.211 ************************************ 00:07:54.211 START TEST json_config_extra_key 00:07:54.211 ************************************ 00:07:54.211 10:15:31 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:54.211 10:15:31 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:54.211 10:15:31 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:54.211 10:15:31 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:54.211 10:15:31 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:54.211 10:15:31 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:54.211 10:15:31 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:54.211 10:15:31 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:54.211 10:15:31 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:54.211 10:15:31 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:54.211 10:15:31 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:54.211 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:54.211 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:54.211 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:54.211 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:54.211 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:54.211 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:07:54.212 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:54.212 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:54.212 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:54.212 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:54.212 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:54.212 INFO: launching applications... 00:07:54.212 10:15:31 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=440772 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:54.212 Waiting for target to run... 
00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 440772 /var/tmp/spdk_tgt.sock 00:07:54.212 10:15:31 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 440772 ']' 00:07:54.212 10:15:31 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:54.212 10:15:31 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:54.212 10:15:31 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:54.212 10:15:31 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:54.212 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:54.212 10:15:31 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:54.212 10:15:31 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:54.212 [2024-07-15 10:15:31.374349] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:07:54.212 [2024-07-15 10:15:31.374425] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid440772 ] 00:07:54.776 [2024-07-15 10:15:31.771228] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.776 [2024-07-15 10:15:31.861556] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.341 10:15:32 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:55.341 10:15:32 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:55.341 00:07:55.341 10:15:32 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:55.341 INFO: shutting down applications... 00:07:55.341 10:15:32 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 440772 ]] 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 440772 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 440772 00:07:55.341 10:15:32 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:55.908 10:15:32 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:55.908 10:15:32 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:07:55.908 10:15:32 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 440772 00:07:55.908 10:15:32 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:55.908 10:15:32 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:55.908 10:15:32 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:55.908 10:15:32 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:55.908 SPDK target shutdown done 00:07:55.908 10:15:32 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:55.908 Success 00:07:55.908 00:07:55.908 real 0m1.608s 00:07:55.908 user 0m1.265s 00:07:55.908 sys 0m0.524s 00:07:55.908 10:15:32 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:55.908 10:15:32 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:55.908 ************************************ 00:07:55.908 END TEST json_config_extra_key 00:07:55.908 ************************************ 00:07:55.908 10:15:32 -- common/autotest_common.sh@1142 -- # return 0 00:07:55.908 10:15:32 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:55.908 10:15:32 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:55.908 10:15:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:55.908 10:15:32 -- common/autotest_common.sh@10 -- # set +x 00:07:55.908 ************************************ 00:07:55.908 START TEST alias_rpc 00:07:55.908 ************************************ 00:07:55.908 10:15:32 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:55.908 * Looking for test storage... 
00:07:55.908 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:55.908 10:15:32 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:55.908 10:15:33 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=441053 00:07:55.908 10:15:33 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:55.908 10:15:33 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 441053 00:07:55.908 10:15:33 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 441053 ']' 00:07:55.908 10:15:33 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:55.908 10:15:33 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:55.908 10:15:33 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:55.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:55.908 10:15:33 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:55.908 10:15:33 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:55.908 [2024-07-15 10:15:33.060569] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:07:55.908 [2024-07-15 10:15:33.060635] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid441053 ] 00:07:56.165 [2024-07-15 10:15:33.175481] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.165 [2024-07-15 10:15:33.276951] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.095 10:15:33 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:57.095 10:15:33 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:07:57.096 10:15:33 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:57.096 10:15:34 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 441053 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 441053 ']' 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 441053 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 441053 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 441053' 00:07:57.096 killing process with pid 441053 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@967 -- # kill 441053 00:07:57.096 10:15:34 alias_rpc -- common/autotest_common.sh@972 -- # wait 441053 00:07:57.660 00:07:57.660 real 0m1.749s 00:07:57.660 user 0m1.876s 00:07:57.660 sys 0m0.556s 00:07:57.660 10:15:34 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:07:57.660 10:15:34 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:57.660 ************************************ 00:07:57.660 END TEST alias_rpc 00:07:57.660 ************************************ 00:07:57.660 10:15:34 -- common/autotest_common.sh@1142 -- # return 0 00:07:57.660 10:15:34 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:57.660 10:15:34 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:57.660 10:15:34 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:57.660 10:15:34 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:57.661 10:15:34 -- common/autotest_common.sh@10 -- # set +x 00:07:57.661 ************************************ 00:07:57.661 START TEST spdkcli_tcp 00:07:57.661 ************************************ 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:57.661 * Looking for test storage... 
00:07:57.661 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=441291 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 441291 00:07:57.661 10:15:34 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 441291 ']' 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:57.661 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:57.661 10:15:34 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:57.918 [2024-07-15 10:15:34.902719] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:07:57.918 [2024-07-15 10:15:34.902794] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid441291 ] 00:07:57.918 [2024-07-15 10:15:35.032424] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:58.176 [2024-07-15 10:15:35.132794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:58.176 [2024-07-15 10:15:35.132798] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.741 10:15:35 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:58.741 10:15:35 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:07:58.741 10:15:35 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=441466 00:07:58.741 10:15:35 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:58.741 10:15:35 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:58.999 [ 00:07:58.999 "bdev_malloc_delete", 00:07:58.999 "bdev_malloc_create", 00:07:58.999 "bdev_null_resize", 00:07:58.999 "bdev_null_delete", 00:07:58.999 "bdev_null_create", 00:07:58.999 "bdev_nvme_cuse_unregister", 00:07:58.999 "bdev_nvme_cuse_register", 00:07:58.999 "bdev_opal_new_user", 00:07:58.999 "bdev_opal_set_lock_state", 00:07:58.999 "bdev_opal_delete", 00:07:58.999 "bdev_opal_get_info", 00:07:58.999 "bdev_opal_create", 00:07:58.999 "bdev_nvme_opal_revert", 00:07:58.999 "bdev_nvme_opal_init", 00:07:58.999 "bdev_nvme_send_cmd", 00:07:58.999 
"bdev_nvme_get_path_iostat", 00:07:58.999 "bdev_nvme_get_mdns_discovery_info", 00:07:58.999 "bdev_nvme_stop_mdns_discovery", 00:07:58.999 "bdev_nvme_start_mdns_discovery", 00:07:58.999 "bdev_nvme_set_multipath_policy", 00:07:58.999 "bdev_nvme_set_preferred_path", 00:07:58.999 "bdev_nvme_get_io_paths", 00:07:58.999 "bdev_nvme_remove_error_injection", 00:07:58.999 "bdev_nvme_add_error_injection", 00:07:58.999 "bdev_nvme_get_discovery_info", 00:07:58.999 "bdev_nvme_stop_discovery", 00:07:58.999 "bdev_nvme_start_discovery", 00:07:58.999 "bdev_nvme_get_controller_health_info", 00:07:58.999 "bdev_nvme_disable_controller", 00:07:58.999 "bdev_nvme_enable_controller", 00:07:58.999 "bdev_nvme_reset_controller", 00:07:58.999 "bdev_nvme_get_transport_statistics", 00:07:58.999 "bdev_nvme_apply_firmware", 00:07:58.999 "bdev_nvme_detach_controller", 00:07:58.999 "bdev_nvme_get_controllers", 00:07:58.999 "bdev_nvme_attach_controller", 00:07:58.999 "bdev_nvme_set_hotplug", 00:07:58.999 "bdev_nvme_set_options", 00:07:58.999 "bdev_passthru_delete", 00:07:58.999 "bdev_passthru_create", 00:07:58.999 "bdev_lvol_set_parent_bdev", 00:07:58.999 "bdev_lvol_set_parent", 00:07:58.999 "bdev_lvol_check_shallow_copy", 00:07:58.999 "bdev_lvol_start_shallow_copy", 00:07:58.999 "bdev_lvol_grow_lvstore", 00:07:58.999 "bdev_lvol_get_lvols", 00:07:58.999 "bdev_lvol_get_lvstores", 00:07:58.999 "bdev_lvol_delete", 00:07:58.999 "bdev_lvol_set_read_only", 00:07:58.999 "bdev_lvol_resize", 00:07:58.999 "bdev_lvol_decouple_parent", 00:07:58.999 "bdev_lvol_inflate", 00:07:58.999 "bdev_lvol_rename", 00:07:58.999 "bdev_lvol_clone_bdev", 00:07:58.999 "bdev_lvol_clone", 00:07:58.999 "bdev_lvol_snapshot", 00:07:58.999 "bdev_lvol_create", 00:07:58.999 "bdev_lvol_delete_lvstore", 00:07:58.999 "bdev_lvol_rename_lvstore", 00:07:58.999 "bdev_lvol_create_lvstore", 00:07:58.999 "bdev_raid_set_options", 00:07:58.999 "bdev_raid_remove_base_bdev", 00:07:58.999 "bdev_raid_add_base_bdev", 00:07:58.999 "bdev_raid_delete", 
00:07:58.999 "bdev_raid_create", 00:07:58.999 "bdev_raid_get_bdevs", 00:07:58.999 "bdev_error_inject_error", 00:07:58.999 "bdev_error_delete", 00:07:58.999 "bdev_error_create", 00:07:58.999 "bdev_split_delete", 00:07:58.999 "bdev_split_create", 00:07:58.999 "bdev_delay_delete", 00:07:58.999 "bdev_delay_create", 00:07:58.999 "bdev_delay_update_latency", 00:07:58.999 "bdev_zone_block_delete", 00:07:58.999 "bdev_zone_block_create", 00:07:58.999 "blobfs_create", 00:07:58.999 "blobfs_detect", 00:07:58.999 "blobfs_set_cache_size", 00:07:58.999 "bdev_crypto_delete", 00:07:58.999 "bdev_crypto_create", 00:07:58.999 "bdev_compress_delete", 00:07:58.999 "bdev_compress_create", 00:07:58.999 "bdev_compress_get_orphans", 00:07:58.999 "bdev_aio_delete", 00:07:58.999 "bdev_aio_rescan", 00:07:58.999 "bdev_aio_create", 00:07:58.999 "bdev_ftl_set_property", 00:07:58.999 "bdev_ftl_get_properties", 00:07:58.999 "bdev_ftl_get_stats", 00:07:58.999 "bdev_ftl_unmap", 00:07:58.999 "bdev_ftl_unload", 00:07:58.999 "bdev_ftl_delete", 00:07:58.999 "bdev_ftl_load", 00:07:58.999 "bdev_ftl_create", 00:07:58.999 "bdev_virtio_attach_controller", 00:07:58.999 "bdev_virtio_scsi_get_devices", 00:07:58.999 "bdev_virtio_detach_controller", 00:07:58.999 "bdev_virtio_blk_set_hotplug", 00:07:58.999 "bdev_iscsi_delete", 00:07:58.999 "bdev_iscsi_create", 00:07:58.999 "bdev_iscsi_set_options", 00:07:58.999 "accel_error_inject_error", 00:07:58.999 "ioat_scan_accel_module", 00:07:58.999 "dsa_scan_accel_module", 00:07:58.999 "iaa_scan_accel_module", 00:07:58.999 "dpdk_cryptodev_get_driver", 00:07:58.999 "dpdk_cryptodev_set_driver", 00:07:58.999 "dpdk_cryptodev_scan_accel_module", 00:07:59.000 "compressdev_scan_accel_module", 00:07:59.000 "keyring_file_remove_key", 00:07:59.000 "keyring_file_add_key", 00:07:59.000 "keyring_linux_set_options", 00:07:59.000 "iscsi_get_histogram", 00:07:59.000 "iscsi_enable_histogram", 00:07:59.000 "iscsi_set_options", 00:07:59.000 "iscsi_get_auth_groups", 00:07:59.000 
"iscsi_auth_group_remove_secret", 00:07:59.000 "iscsi_auth_group_add_secret", 00:07:59.000 "iscsi_delete_auth_group", 00:07:59.000 "iscsi_create_auth_group", 00:07:59.000 "iscsi_set_discovery_auth", 00:07:59.000 "iscsi_get_options", 00:07:59.000 "iscsi_target_node_request_logout", 00:07:59.000 "iscsi_target_node_set_redirect", 00:07:59.000 "iscsi_target_node_set_auth", 00:07:59.000 "iscsi_target_node_add_lun", 00:07:59.000 "iscsi_get_stats", 00:07:59.000 "iscsi_get_connections", 00:07:59.000 "iscsi_portal_group_set_auth", 00:07:59.000 "iscsi_start_portal_group", 00:07:59.000 "iscsi_delete_portal_group", 00:07:59.000 "iscsi_create_portal_group", 00:07:59.000 "iscsi_get_portal_groups", 00:07:59.000 "iscsi_delete_target_node", 00:07:59.000 "iscsi_target_node_remove_pg_ig_maps", 00:07:59.000 "iscsi_target_node_add_pg_ig_maps", 00:07:59.000 "iscsi_create_target_node", 00:07:59.000 "iscsi_get_target_nodes", 00:07:59.000 "iscsi_delete_initiator_group", 00:07:59.000 "iscsi_initiator_group_remove_initiators", 00:07:59.000 "iscsi_initiator_group_add_initiators", 00:07:59.000 "iscsi_create_initiator_group", 00:07:59.000 "iscsi_get_initiator_groups", 00:07:59.000 "nvmf_set_crdt", 00:07:59.000 "nvmf_set_config", 00:07:59.000 "nvmf_set_max_subsystems", 00:07:59.000 "nvmf_stop_mdns_prr", 00:07:59.000 "nvmf_publish_mdns_prr", 00:07:59.000 "nvmf_subsystem_get_listeners", 00:07:59.000 "nvmf_subsystem_get_qpairs", 00:07:59.000 "nvmf_subsystem_get_controllers", 00:07:59.000 "nvmf_get_stats", 00:07:59.000 "nvmf_get_transports", 00:07:59.000 "nvmf_create_transport", 00:07:59.000 "nvmf_get_targets", 00:07:59.000 "nvmf_delete_target", 00:07:59.000 "nvmf_create_target", 00:07:59.000 "nvmf_subsystem_allow_any_host", 00:07:59.000 "nvmf_subsystem_remove_host", 00:07:59.000 "nvmf_subsystem_add_host", 00:07:59.000 "nvmf_ns_remove_host", 00:07:59.000 "nvmf_ns_add_host", 00:07:59.000 "nvmf_subsystem_remove_ns", 00:07:59.000 "nvmf_subsystem_add_ns", 00:07:59.000 
"nvmf_subsystem_listener_set_ana_state", 00:07:59.000 "nvmf_discovery_get_referrals", 00:07:59.000 "nvmf_discovery_remove_referral", 00:07:59.000 "nvmf_discovery_add_referral", 00:07:59.000 "nvmf_subsystem_remove_listener", 00:07:59.000 "nvmf_subsystem_add_listener", 00:07:59.000 "nvmf_delete_subsystem", 00:07:59.000 "nvmf_create_subsystem", 00:07:59.000 "nvmf_get_subsystems", 00:07:59.000 "env_dpdk_get_mem_stats", 00:07:59.000 "nbd_get_disks", 00:07:59.000 "nbd_stop_disk", 00:07:59.000 "nbd_start_disk", 00:07:59.000 "ublk_recover_disk", 00:07:59.000 "ublk_get_disks", 00:07:59.000 "ublk_stop_disk", 00:07:59.000 "ublk_start_disk", 00:07:59.000 "ublk_destroy_target", 00:07:59.000 "ublk_create_target", 00:07:59.000 "virtio_blk_create_transport", 00:07:59.000 "virtio_blk_get_transports", 00:07:59.000 "vhost_controller_set_coalescing", 00:07:59.000 "vhost_get_controllers", 00:07:59.000 "vhost_delete_controller", 00:07:59.000 "vhost_create_blk_controller", 00:07:59.000 "vhost_scsi_controller_remove_target", 00:07:59.000 "vhost_scsi_controller_add_target", 00:07:59.000 "vhost_start_scsi_controller", 00:07:59.000 "vhost_create_scsi_controller", 00:07:59.000 "thread_set_cpumask", 00:07:59.000 "framework_get_governor", 00:07:59.000 "framework_get_scheduler", 00:07:59.000 "framework_set_scheduler", 00:07:59.000 "framework_get_reactors", 00:07:59.000 "thread_get_io_channels", 00:07:59.000 "thread_get_pollers", 00:07:59.000 "thread_get_stats", 00:07:59.000 "framework_monitor_context_switch", 00:07:59.000 "spdk_kill_instance", 00:07:59.000 "log_enable_timestamps", 00:07:59.000 "log_get_flags", 00:07:59.000 "log_clear_flag", 00:07:59.000 "log_set_flag", 00:07:59.000 "log_get_level", 00:07:59.000 "log_set_level", 00:07:59.000 "log_get_print_level", 00:07:59.000 "log_set_print_level", 00:07:59.000 "framework_enable_cpumask_locks", 00:07:59.000 "framework_disable_cpumask_locks", 00:07:59.000 "framework_wait_init", 00:07:59.000 "framework_start_init", 00:07:59.000 "scsi_get_devices", 
00:07:59.000 "bdev_get_histogram", 00:07:59.000 "bdev_enable_histogram", 00:07:59.000 "bdev_set_qos_limit", 00:07:59.000 "bdev_set_qd_sampling_period", 00:07:59.000 "bdev_get_bdevs", 00:07:59.000 "bdev_reset_iostat", 00:07:59.000 "bdev_get_iostat", 00:07:59.000 "bdev_examine", 00:07:59.000 "bdev_wait_for_examine", 00:07:59.000 "bdev_set_options", 00:07:59.000 "notify_get_notifications", 00:07:59.000 "notify_get_types", 00:07:59.000 "accel_get_stats", 00:07:59.000 "accel_set_options", 00:07:59.000 "accel_set_driver", 00:07:59.000 "accel_crypto_key_destroy", 00:07:59.000 "accel_crypto_keys_get", 00:07:59.000 "accel_crypto_key_create", 00:07:59.000 "accel_assign_opc", 00:07:59.000 "accel_get_module_info", 00:07:59.000 "accel_get_opc_assignments", 00:07:59.000 "vmd_rescan", 00:07:59.000 "vmd_remove_device", 00:07:59.000 "vmd_enable", 00:07:59.000 "sock_get_default_impl", 00:07:59.000 "sock_set_default_impl", 00:07:59.000 "sock_impl_set_options", 00:07:59.000 "sock_impl_get_options", 00:07:59.000 "iobuf_get_stats", 00:07:59.000 "iobuf_set_options", 00:07:59.000 "framework_get_pci_devices", 00:07:59.000 "framework_get_config", 00:07:59.000 "framework_get_subsystems", 00:07:59.000 "trace_get_info", 00:07:59.000 "trace_get_tpoint_group_mask", 00:07:59.000 "trace_disable_tpoint_group", 00:07:59.000 "trace_enable_tpoint_group", 00:07:59.000 "trace_clear_tpoint_mask", 00:07:59.000 "trace_set_tpoint_mask", 00:07:59.000 "keyring_get_keys", 00:07:59.000 "spdk_get_version", 00:07:59.000 "rpc_get_methods" 00:07:59.000 ] 00:07:59.000 10:15:36 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:59.000 10:15:36 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:59.000 10:15:36 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 441291 00:07:59.000 10:15:36 spdkcli_tcp -- 
common/autotest_common.sh@948 -- # '[' -z 441291 ']' 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 441291 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 441291 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 441291' 00:07:59.000 killing process with pid 441291 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 441291 00:07:59.000 10:15:36 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 441291 00:07:59.565 00:07:59.565 real 0m1.804s 00:07:59.565 user 0m3.215s 00:07:59.565 sys 0m0.623s 00:07:59.565 10:15:36 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:59.565 10:15:36 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:59.565 ************************************ 00:07:59.565 END TEST spdkcli_tcp 00:07:59.565 ************************************ 00:07:59.565 10:15:36 -- common/autotest_common.sh@1142 -- # return 0 00:07:59.565 10:15:36 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:59.565 10:15:36 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:59.565 10:15:36 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:59.565 10:15:36 -- common/autotest_common.sh@10 -- # set +x 00:07:59.565 ************************************ 00:07:59.565 START TEST dpdk_mem_utility 00:07:59.565 ************************************ 00:07:59.565 10:15:36 dpdk_mem_utility -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:59.565 * Looking for test storage... 00:07:59.565 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:59.565 10:15:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:59.565 10:15:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=441635 00:07:59.565 10:15:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 441635 00:07:59.565 10:15:36 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:59.565 10:15:36 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 441635 ']' 00:07:59.565 10:15:36 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:59.565 10:15:36 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:59.565 10:15:36 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:59.565 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:59.565 10:15:36 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:59.565 10:15:36 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:59.565 [2024-07-15 10:15:36.745563] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:07:59.565 [2024-07-15 10:15:36.745640] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid441635 ] 00:07:59.823 [2024-07-15 10:15:36.875896] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.823 [2024-07-15 10:15:36.973151] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.756 10:15:37 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:00.756 10:15:37 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:08:00.756 10:15:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:08:00.756 10:15:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:08:00.756 10:15:37 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:00.756 10:15:37 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:00.756 { 00:08:00.756 "filename": "/tmp/spdk_mem_dump.txt" 00:08:00.756 } 00:08:00.756 10:15:37 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:00.756 10:15:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:00.756 DPDK memory size 816.000000 MiB in 2 heap(s) 00:08:00.756 2 heaps totaling size 816.000000 MiB 00:08:00.756 size: 814.000000 MiB heap id: 0 00:08:00.756 size: 2.000000 MiB heap id: 1 00:08:00.756 end heaps---------- 00:08:00.756 8 mempools totaling size 598.116089 MiB 00:08:00.756 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:08:00.756 size: 158.602051 MiB name: PDU_data_out_Pool 00:08:00.756 size: 84.521057 MiB name: bdev_io_441635 00:08:00.756 size: 51.011292 MiB name: evtpool_441635 00:08:00.756 size: 50.003479 
MiB name: msgpool_441635 00:08:00.756 size: 21.763794 MiB name: PDU_Pool 00:08:00.756 size: 19.513306 MiB name: SCSI_TASK_Pool 00:08:00.756 size: 0.026123 MiB name: Session_Pool 00:08:00.756 end mempools------- 00:08:00.756 201 memzones totaling size 4.176453 MiB 00:08:00.756 size: 1.000366 MiB name: RG_ring_0_441635 00:08:00.756 size: 1.000366 MiB name: RG_ring_1_441635 00:08:00.756 size: 1.000366 MiB name: RG_ring_4_441635 00:08:00.756 size: 1.000366 MiB name: RG_ring_5_441635 00:08:00.756 size: 0.125366 MiB name: RG_ring_2_441635 00:08:00.756 size: 0.015991 MiB name: RG_ring_3_441635 00:08:00.756 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:01.7_qat 00:08:00.756 size: 0.000305 
MiB name: 0000:3f:02.0_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:08:00.756 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.0_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.1_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.2_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.3_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.4_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.5_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.6_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:01.7_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.0_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.1_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.2_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.3_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.4_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.5_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.6_qat 00:08:00.756 size: 0.000305 MiB name: 0000:da:02.7_qat 00:08:00.756 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_0 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_1 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_0 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_2 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_3 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_1 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_4 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_5 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_2 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_6 00:08:00.756 size: 0.000122 MiB name: 
rte_cryptodev_data_7 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_3 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_8 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_9 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_4 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_10 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_11 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_5 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_12 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_13 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_6 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_14 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_15 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_7 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_16 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_17 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_8 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_18 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_19 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_9 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_20 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_21 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_10 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_22 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_23 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_11 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_24 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_25 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_12 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_26 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_27 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_13 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_28 00:08:00.756 size: 0.000122 MiB name: 
rte_cryptodev_data_29 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_14 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_30 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_31 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_15 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_32 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_33 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_16 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_34 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_35 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_17 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_36 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_37 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_18 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_38 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_39 00:08:00.756 size: 0.000122 MiB name: rte_compressdev_data_19 00:08:00.756 size: 0.000122 MiB name: rte_cryptodev_data_40 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_41 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_20 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_42 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_43 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_21 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_44 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_45 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_22 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_46 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_47 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_23 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_48 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_49 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_24 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_50 00:08:00.757 size: 0.000122 MiB 
name: rte_cryptodev_data_51 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_25 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_52 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_53 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_26 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_54 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_55 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_27 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_56 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_57 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_28 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_58 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_59 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_29 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_60 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_61 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_30 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_62 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_63 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_31 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_64 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_65 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_32 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_66 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_67 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_33 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_68 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_69 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_34 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_70 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_71 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_35 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_72 00:08:00.757 size: 0.000122 
MiB name: rte_cryptodev_data_73 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_36 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_74 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_75 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_37 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_76 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_77 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_38 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_78 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_79 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_39 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_80 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_81 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_40 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_82 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_83 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_41 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_84 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_85 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_42 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_86 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_87 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_43 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_88 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_89 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_44 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_90 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_91 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_45 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_92 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_93 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_46 00:08:00.757 size: 0.000122 MiB name: rte_cryptodev_data_94 00:08:00.757 size: 
0.000122 MiB name: rte_cryptodev_data_95 00:08:00.757 size: 0.000122 MiB name: rte_compressdev_data_47 00:08:00.757 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:00.757 end memzones------- 00:08:00.757 10:15:37 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:08:01.021 heap id: 0 total size: 814.000000 MiB number of busy elements: 514 number of free elements: 14 00:08:01.021 list of free elements. size: 11.815735 MiB 00:08:01.021 element at address: 0x200000400000 with size: 1.999512 MiB 00:08:01.021 element at address: 0x200018e00000 with size: 0.999878 MiB 00:08:01.021 element at address: 0x200019000000 with size: 0.999878 MiB 00:08:01.021 element at address: 0x200003e00000 with size: 0.996460 MiB 00:08:01.021 element at address: 0x200031c00000 with size: 0.994446 MiB 00:08:01.021 element at address: 0x200013800000 with size: 0.978882 MiB 00:08:01.021 element at address: 0x200007000000 with size: 0.960022 MiB 00:08:01.021 element at address: 0x200019200000 with size: 0.937256 MiB 00:08:01.021 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:08:01.021 element at address: 0x200003a00000 with size: 0.498535 MiB 00:08:01.021 element at address: 0x20000b200000 with size: 0.491272 MiB 00:08:01.021 element at address: 0x200000800000 with size: 0.486328 MiB 00:08:01.021 element at address: 0x200019400000 with size: 0.485840 MiB 00:08:01.021 element at address: 0x200027e00000 with size: 0.404175 MiB 00:08:01.021 list of standard malloc elements. 
size: 199.875977 MiB 00:08:01.021 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:08:01.021 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:08:01.021 element at address: 0x200018efff80 with size: 1.000122 MiB 00:08:01.021 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:08:01.021 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:08:01.021 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:08:01.021 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:08:01.021 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:08:01.021 element at address: 0x200000330b40 with size: 0.004395 MiB 00:08:01.021 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000337640 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000033e140 with size: 0.004395 MiB 00:08:01.021 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000344c40 with size: 0.004395 MiB 00:08:01.021 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000034b740 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000352240 with size: 0.004395 MiB 00:08:01.021 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000358d40 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000035f840 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000366880 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000036a340 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000036de00 with size: 0.004395 MiB 00:08:01.021 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000375380 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000378e40 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000037c900 with size: 0.004395 MiB 00:08:01.021 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000383e80 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000387940 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000038b400 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000392980 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000396440 with size: 0.004395 MiB 00:08:01.021 element at address: 0x200000399f00 with size: 0.004395 MiB 00:08:01.021 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:08:01.022 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:08:01.022 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:08:01.022 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000333040 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000335540 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000339b40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000033c040 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000340640 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000342b40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000347140 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000349640 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000350140 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000354740 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000356c40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:08:01.022 element at address: 0x20000035b240 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000035d740 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000361d40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000364780 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000365800 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000368240 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000370840 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000373280 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000374300 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000376d40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000037a800 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000037b880 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000037f340 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000381d80 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000382e00 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000385840 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000389300 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000038a380 with size: 0.004028 MiB 00:08:01.022 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000038de40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000390880 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000391900 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000394340 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000397e00 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000398e80 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000039c940 with size: 0.004028 MiB 00:08:01.022 element at address: 0x20000039f380 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:08:01.022 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:08:01.022 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:08:01.022 element at address: 0x200000204ec0 with size: 0.000305 MiB 00:08:01.022 element at address: 0x200000200000 with size: 0.000183 MiB 00:08:01.022 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:08:01.022 element at address: 0x200000200180 with size: 0.000183 MiB 00:08:01.022 element at address: 0x200000200240 with size: 0.000183 MiB 00:08:01.022 element at address: 0x200000200300 with size: 0.000183 MiB 00:08:01.022 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200480 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200540 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200600 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200780 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200840 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200900 with size: 0.000183 
MiB 00:08:01.023 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200a80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200b40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200c00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200d80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200e40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200f00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201080 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201140 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201200 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201380 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201440 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201500 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201680 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201740 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201800 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201980 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201a40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201b00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201c80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201d40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000201f80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202040 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202100 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202280 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202340 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202400 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202580 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202640 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202700 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202880 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202940 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202a00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202b80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202c40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202d00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202e80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000202f40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203000 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203180 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203240 with size: 0.000183 MiB 00:08:01.023 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203480 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203540 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203600 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203780 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203840 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203900 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203a80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203b40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203c00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203d80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203e40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203f00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204080 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204140 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204200 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002042c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204380 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204440 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204500 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002045c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204680 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204740 with size: 0.000183 MiB 
00:08:01.023 element at address: 0x200000204800 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002048c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204980 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204a40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204b00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204bc0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204c80 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204d40 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000204e00 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205000 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205180 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205240 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205300 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205480 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205540 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205600 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205780 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205840 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205900 with size: 0.000183 MiB 00:08:01.023 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:08:01.023 element at address: 0x200000205a80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000205b40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000205c00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:08:01.024 element at address: 0x200000205e40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000205f00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000206080 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000206140 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000206200 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000020a780 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022af80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b040 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b100 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b280 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b340 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b400 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b580 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b640 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b700 with size: 0.000183 MiB 00:08:01.024 element at address: 
0x20000022b900 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022be40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022c080 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022c140 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022c200 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022c380 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022c440 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000022c500 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000032e700 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000331d40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000338840 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000033f340 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000345e40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000034c940 with size: 0.000183 MiB 00:08:01.024 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000353440 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000359f40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000360a40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000364180 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000364240 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000364400 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000367a80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000367c40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000367d00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036b540 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036b700 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036b980 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036f000 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036f280 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000036f440 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000372c80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000372d40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000372f00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000376580 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000376740 with size: 0.000183 
MiB 00:08:01.024 element at address: 0x200000376800 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037a040 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037a200 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037a480 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037db00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000037df40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000381780 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000381840 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000381a00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000385080 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000385240 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000385300 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000388b40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000388d00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000388f80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000038c600 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000038c880 with size: 0.000183 MiB 00:08:01.024 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000390280 
with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000390340 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000390500 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000393b80 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000393d40 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000393e00 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000397640 with size: 0.000183 MiB 00:08:01.024 element at address: 0x200000397800 with size: 0.000183 MiB 00:08:01.024 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200000397a80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039b100 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039b380 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039b540 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000039f000 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:08:01.025 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:08:01.025 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087c800 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087c980 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087cb00 with 
size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e67780 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e67840 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6e440 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:08:01.025 element at address: 
0x200027e6f3c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:08:01.025 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:08:01.025 list of memzone associated elements. 
size: 602.308289 MiB 00:08:01.025 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:08:01.025 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:08:01.026 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:08:01.026 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:08:01.026 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:08:01.026 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_441635_0 00:08:01.026 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:08:01.026 associated memzone info: size: 48.002930 MiB name: MP_evtpool_441635_0 00:08:01.026 element at address: 0x200003fff380 with size: 48.003052 MiB 00:08:01.026 associated memzone info: size: 48.002930 MiB name: MP_msgpool_441635_0 00:08:01.026 element at address: 0x2000195be940 with size: 20.255554 MiB 00:08:01.026 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:08:01.026 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:08:01.026 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:08:01.026 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:08:01.026 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_441635 00:08:01.026 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:08:01.026 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_441635 00:08:01.026 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:08:01.026 associated memzone info: size: 1.007996 MiB name: MP_evtpool_441635 00:08:01.026 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:08:01.026 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:08:01.026 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:08:01.026 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:08:01.026 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:08:01.026 associated 
memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:08:01.026 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:08:01.026 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:08:01.026 element at address: 0x200003eff180 with size: 1.000488 MiB 00:08:01.026 associated memzone info: size: 1.000366 MiB name: RG_ring_0_441635 00:08:01.026 element at address: 0x200003affc00 with size: 1.000488 MiB 00:08:01.026 associated memzone info: size: 1.000366 MiB name: RG_ring_1_441635 00:08:01.026 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:08:01.026 associated memzone info: size: 1.000366 MiB name: RG_ring_4_441635 00:08:01.026 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:08:01.026 associated memzone info: size: 1.000366 MiB name: RG_ring_5_441635 00:08:01.026 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:08:01.026 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_441635 00:08:01.026 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:08:01.026 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:08:01.026 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:08:01.026 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:08:01.026 element at address: 0x20001947c600 with size: 0.250488 MiB 00:08:01.026 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:08:01.026 element at address: 0x20000020a840 with size: 0.125488 MiB 00:08:01.026 associated memzone info: size: 0.125366 MiB name: RG_ring_2_441635 00:08:01.026 element at address: 0x2000070f5c40 with size: 0.031738 MiB 00:08:01.026 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:08:01.026 element at address: 0x200027e67900 with size: 0.023743 MiB 00:08:01.026 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:08:01.026 element at address: 0x200000206580 with size: 0.016113 MiB 00:08:01.026 
associated memzone info: size: 0.015991 MiB name: RG_ring_3_441635 00:08:01.026 element at address: 0x200027e6da40 with size: 0.002441 MiB 00:08:01.026 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:08:01.026 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:08:01.026 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:01.026 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:08:01.026 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:08:01.026 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:08:01.026 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:08:01.026 element at address: 0x2000003c7800 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:08:01.026 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:08:01.026 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:08:01.026 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:08:01.026 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:08:01.026 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:08:01.026 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:08:01.026 associated memzone 
info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:08:01.026 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:08:01.026 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:08:01.026 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:08:01.026 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:08:01.026 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:08:01.026 element at address: 0x20000039b700 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:08:01.026 element at address: 0x200000397c40 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat 00:08:01.026 element at address: 0x200000394180 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat 00:08:01.026 element at address: 0x2000003906c0 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat 00:08:01.026 element at address: 0x20000038cc00 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat 00:08:01.026 element at address: 0x200000389140 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat 00:08:01.026 element at address: 0x200000385680 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat 00:08:01.026 element at address: 0x200000381bc0 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 
MiB name: 0000:3f:01.7_qat 00:08:01.026 element at address: 0x20000037e100 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat 00:08:01.026 element at address: 0x20000037a640 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat 00:08:01.026 element at address: 0x200000376b80 with size: 0.000427 MiB 00:08:01.026 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat 00:08:01.026 element at address: 0x2000003730c0 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat 00:08:01.027 element at address: 0x20000036f600 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat 00:08:01.027 element at address: 0x20000036bb40 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat 00:08:01.027 element at address: 0x200000368080 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat 00:08:01.027 element at address: 0x2000003645c0 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat 00:08:01.027 element at address: 0x200000360b00 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat 00:08:01.027 element at address: 0x20000035d580 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat 00:08:01.027 element at address: 0x20000035a000 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat 00:08:01.027 element at address: 0x200000356a80 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat 00:08:01.027 element at address: 0x200000353500 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 
0000:da:01.4_qat 00:08:01.027 element at address: 0x20000034ff80 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat 00:08:01.027 element at address: 0x20000034ca00 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat 00:08:01.027 element at address: 0x200000349480 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat 00:08:01.027 element at address: 0x200000345f00 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat 00:08:01.027 element at address: 0x200000342980 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat 00:08:01.027 element at address: 0x20000033f400 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat 00:08:01.027 element at address: 0x20000033be80 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat 00:08:01.027 element at address: 0x200000338900 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat 00:08:01.027 element at address: 0x200000335380 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat 00:08:01.027 element at address: 0x200000331e00 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat 00:08:01.027 element at address: 0x20000032e880 with size: 0.000427 MiB 00:08:01.027 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat 00:08:01.027 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:08:01.027 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:01.027 element at address: 0x20000022b7c0 with size: 0.000305 MiB 00:08:01.027 associated memzone info: size: 0.000183 MiB name: MP_msgpool_441635 
00:08:01.027 element at address: 0x200000206380 with size: 0.000305 MiB
00:08:01.027 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_441635
00:08:01.027 element at address: 0x200027e6e500 with size: 0.000305 MiB
00:08:01.027 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:08:01.027 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:08:01.027 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:08:01.027 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:08:01.027 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:08:01.027 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:08:01.027 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:08:01.027 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:08:01.027 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:08:01.027 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:08:01.027 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:08:01.027 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7
00:08:01.027 element at address: 0x2000003cad80 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3
00:08:01.027 element at address: 0x2000003c7700 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8
00:08:01.027 element at address: 0x2000003c7540 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9
00:08:01.027 element at address: 0x2000003c72c0 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4
00:08:01.027 element at address: 0x2000003c3c40 with size: 0.000244 MiB
00:08:01.027 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10
00:08:01.027 element at address: 0x2000003c3a80 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11
00:08:01.028 element at address: 0x2000003c3800 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5
00:08:01.028 element at address: 0x2000003c0180 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12
00:08:01.028 element at address: 0x2000003bffc0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13
00:08:01.028 element at address: 0x2000003bfd40 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6
00:08:01.028 element at address: 0x2000003bc6c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14
00:08:01.028 element at address: 0x2000003bc500 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15
00:08:01.028 element at address: 0x2000003bc280 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7
00:08:01.028 element at address: 0x2000003b8c00 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16
00:08:01.028 element at address: 0x2000003b8a40 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17
00:08:01.028 element at address: 0x2000003b87c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8
00:08:01.028 element at address: 0x2000003b5140 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18
00:08:01.028 element at address: 0x2000003b4f80 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19
00:08:01.028 element at address: 0x2000003b4d00 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9
00:08:01.028 element at address: 0x2000003b1680 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20
00:08:01.028 element at address: 0x2000003b14c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21
00:08:01.028 element at address: 0x2000003b1240 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10
00:08:01.028 element at address: 0x2000003adbc0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22
00:08:01.028 element at address: 0x2000003ada00 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23
00:08:01.028 element at address: 0x2000003ad780 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11
00:08:01.028 element at address: 0x2000003aa100 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24
00:08:01.028 element at address: 0x2000003a9f40 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25
00:08:01.028 element at address: 0x2000003a9cc0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12
00:08:01.028 element at address: 0x2000003a6640 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26
00:08:01.028 element at address: 0x2000003a6480 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27
00:08:01.028 element at address: 0x2000003a6200 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13
00:08:01.028 element at address: 0x2000003a2b80 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28
00:08:01.028 element at address: 0x2000003a29c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29
00:08:01.028 element at address: 0x2000003a2740 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14
00:08:01.028 element at address: 0x20000039f0c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30
00:08:01.028 element at address: 0x20000039ef00 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31
00:08:01.028 element at address: 0x20000039ec80 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15
00:08:01.028 element at address: 0x20000039b600 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32
00:08:01.028 element at address: 0x20000039b440 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33
00:08:01.028 element at address: 0x20000039b1c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16
00:08:01.028 element at address: 0x200000397b40 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34
00:08:01.028 element at address: 0x200000397980 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35
00:08:01.028 element at address: 0x200000397700 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17
00:08:01.028 element at address: 0x200000394080 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36
00:08:01.028 element at address: 0x200000393ec0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37
00:08:01.028 element at address: 0x200000393c40 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18
00:08:01.028 element at address: 0x2000003905c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38
00:08:01.028 element at address: 0x200000390400 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39
00:08:01.028 element at address: 0x200000390180 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19
00:08:01.028 element at address: 0x20000038cb00 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40
00:08:01.028 element at address: 0x20000038c940 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41
00:08:01.028 element at address: 0x20000038c6c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20
00:08:01.028 element at address: 0x200000389040 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42
00:08:01.028 element at address: 0x200000388e80 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43
00:08:01.028 element at address: 0x200000388c00 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21
00:08:01.028 element at address: 0x200000385580 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44
00:08:01.028 element at address: 0x2000003853c0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45
00:08:01.028 element at address: 0x200000385140 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22
00:08:01.028 element at address: 0x200000381ac0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46
00:08:01.028 element at address: 0x200000381900 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47
00:08:01.028 element at address: 0x200000381680 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23
00:08:01.028 element at address: 0x20000037e000 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48
00:08:01.028 element at address: 0x20000037de40 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49
00:08:01.028 element at address: 0x20000037dbc0 with size: 0.000244 MiB
00:08:01.028 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24
00:08:01.028 element at address: 0x20000037a540 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50
00:08:01.029 element at address: 0x20000037a380 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51
00:08:01.029 element at address: 0x20000037a100 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25
00:08:01.029 element at address: 0x200000376a80 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52
00:08:01.029 element at address: 0x2000003768c0 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53
00:08:01.029 element at address: 0x200000376640 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26
00:08:01.029 element at address: 0x200000372fc0 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54
00:08:01.029 element at address: 0x200000372e00 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55
00:08:01.029 element at address: 0x200000372b80 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27
00:08:01.029 element at address: 0x20000036f500 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56
00:08:01.029 element at address: 0x20000036f340 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57
00:08:01.029 element at address: 0x20000036f0c0 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28
00:08:01.029 element at address: 0x20000036ba40 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58
00:08:01.029 element at address: 0x20000036b880 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59
00:08:01.029 element at address: 0x20000036b600 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29
00:08:01.029 element at address: 0x200000367f80 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60
00:08:01.029 element at address: 0x200000367dc0 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61
00:08:01.029 element at address: 0x200000367b40 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30
00:08:01.029 element at address: 0x2000003644c0 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62
00:08:01.029 element at address: 0x200000364300 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63
00:08:01.029 element at address: 0x200000364080 with size: 0.000244 MiB
00:08:01.029 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31
00:08:01.029 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:08:01.029 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:08:01.029 10:15:38 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:08:01.029 10:15:38 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 441635
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 441635 ']'
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 441635
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 441635
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 441635'
00:08:01.029 killing process with pid 441635
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 441635
00:08:01.029 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 441635
00:08:01.594
00:08:01.594 real 0m1.961s
00:08:01.594 user 0m2.306s
00:08:01.594 sys 0m0.585s
00:08:01.594 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:01.594 10:15:38 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:08:01.594 ************************************
00:08:01.594 END TEST dpdk_mem_utility
00:08:01.594 ************************************
00:08:01.594 10:15:38 -- common/autotest_common.sh@1142 -- # return 0
00:08:01.594 10:15:38 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:08:01.594 10:15:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:01.594 10:15:38 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:01.594 10:15:38 -- common/autotest_common.sh@10 -- # set +x
00:08:01.594 ************************************
00:08:01.594 START TEST event
00:08:01.594 ************************************
00:08:01.594 10:15:38 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:08:01.594 * Looking for test storage...
00:08:01.594 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:08:01.594 10:15:38 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:08:01.594 10:15:38 event -- bdev/nbd_common.sh@6 -- # set -e
00:08:01.594 10:15:38 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:08:01.594 10:15:38 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']'
00:08:01.594 10:15:38 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:01.594 10:15:38 event -- common/autotest_common.sh@10 -- # set +x
00:08:01.594 ************************************
00:08:01.594 START TEST event_perf
00:08:01.594 ************************************
00:08:01.594 10:15:38 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:08:01.594 Running I/O for 1 seconds...[2024-07-15 10:15:38.792007] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:01.594 [2024-07-15 10:15:38.792067] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid441947 ]
00:08:01.852 [2024-07-15 10:15:38.922416] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4
00:08:01.852 [2024-07-15 10:15:39.025788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:01.852 [2024-07-15 10:15:39.025872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:08:01.852 [2024-07-15 10:15:39.025899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:08:01.852 [2024-07-15 10:15:39.025902] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:03.272 Running I/O for 1 seconds...
00:08:03.272 lcore 0: 179702
00:08:03.272 lcore 1: 179700
00:08:03.272 lcore 2: 179699
00:08:03.272 lcore 3: 179701
00:08:03.272 done.
00:08:03.272
00:08:03.272 real 0m1.359s
00:08:03.272 user 0m4.209s
00:08:03.272 sys 0m0.141s
00:08:03.272 10:15:40 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:03.272 10:15:40 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:08:03.272 ************************************
00:08:03.272 END TEST event_perf
00:08:03.272 ************************************
00:08:03.272 10:15:40 event -- common/autotest_common.sh@1142 -- # return 0
00:08:03.272 10:15:40 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:08:03.272 10:15:40 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:08:03.272 10:15:40 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:03.272 10:15:40 event -- common/autotest_common.sh@10 -- # set +x
00:08:03.272 ************************************
00:08:03.272 START TEST event_reactor
00:08:03.272 ************************************
00:08:03.272 10:15:40 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:08:03.272 [2024-07-15 10:15:40.236535] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:03.272 [2024-07-15 10:15:40.236598] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid442146 ]
00:08:03.272 [2024-07-15 10:15:40.367264] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:03.531 [2024-07-15 10:15:40.471303] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:04.464 test_start
00:08:04.464 oneshot
00:08:04.464 tick 100
00:08:04.464 tick 100
00:08:04.464 tick 250
00:08:04.464 tick 100
00:08:04.464 tick 100
00:08:04.464 tick 250
00:08:04.464 tick 100
00:08:04.464 tick 500
00:08:04.464 tick 100
00:08:04.464 tick 100
00:08:04.464 tick 250
00:08:04.464 tick 100
00:08:04.464 tick 100
00:08:04.464 test_end
00:08:04.464
00:08:04.464 real 0m1.355s
00:08:04.464 user 0m1.204s
00:08:04.464 sys 0m0.144s
00:08:04.464 10:15:41 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:04.464 10:15:41 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:08:04.464 ************************************
00:08:04.464 END TEST event_reactor
00:08:04.464 ************************************
00:08:04.464 10:15:41 event -- common/autotest_common.sh@1142 -- # return 0
00:08:04.464 10:15:41 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:08:04.464 10:15:41 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:08:04.464 10:15:41 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:04.464 10:15:41 event -- common/autotest_common.sh@10 -- # set +x
00:08:04.464 ************************************
00:08:04.464 START TEST event_reactor_perf
00:08:04.464 ************************************
00:08:04.464 10:15:41 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:08:04.721 [2024-07-15 10:15:41.663590] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:04.721 [2024-07-15 10:15:41.663648] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid442346 ]
00:08:04.721 [2024-07-15 10:15:41.790700] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:04.721 [2024-07-15 10:15:41.893056] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:06.092 test_start
00:08:06.092 test_end
00:08:06.092 Performance: 328495 events per second
00:08:06.092
00:08:06.092 real 0m1.346s
00:08:06.092 user 0m1.204s
00:08:06.092 sys 0m0.135s
00:08:06.092 10:15:42 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:06.092 10:15:42 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x
00:08:06.092 ************************************
00:08:06.092 END TEST event_reactor_perf
00:08:06.092 ************************************
00:08:06.092 10:15:43 event -- common/autotest_common.sh@1142 -- # return 0
00:08:06.092 10:15:43 event -- event/event.sh@49 -- # uname -s
00:08:06.092 10:15:43 event -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:08:06.092 10:15:43 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:08:06.092 10:15:43 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:06.092 10:15:43 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:06.092 10:15:43 event -- common/autotest_common.sh@10 -- # set +x
00:08:06.092 ************************************
00:08:06.092 START TEST event_scheduler
00:08:06.092 ************************************
00:08:06.092 10:15:43 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:08:06.092 * Looking for test storage...
00:08:06.092 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler
00:08:06.092 10:15:43 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:08:06.092 10:15:43 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=442567
00:08:06.092 10:15:43 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:08:06.092 10:15:43 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:08:06.092 10:15:43 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 442567
00:08:06.092 10:15:43 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 442567 ']'
00:08:06.092 10:15:43 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:06.092 10:15:43 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100
00:08:06.092 10:15:43 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:08:06.092 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:06.092 10:15:43 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable
00:08:06.092 10:15:43 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:06.092 [2024-07-15 10:15:43.228068] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:06.092 [2024-07-15 10:15:43.228142] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid442567 ]
00:08:06.349 [2024-07-15 10:15:43.332008] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4
00:08:06.349 [2024-07-15 10:15:43.417284] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:06.349 [2024-07-15 10:15:43.417363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:06.349 [2024-07-15 10:15:43.417439] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:08:06.349 [2024-07-15 10:15:43.417441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0
00:08:07.282 10:15:44 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:07.282 [2024-07-15 10:15:44.116120] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings
00:08:07.282 [2024-07-15 10:15:44.116143] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor
00:08:07.282 [2024-07-15 10:15:44.116159] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:08:07.282 [2024-07-15 10:15:44.116171] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:08:07.282 [2024-07-15 10:15:44.116181] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.282 10:15:44 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:07.282 [2024-07-15 10:15:44.208728] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.282 10:15:44 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:07.282 10:15:44 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:07.282 ************************************
00:08:07.282 START TEST scheduler_create_thread
00:08:07.282 ************************************
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.282 2
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.282 3
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.282 4
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:08:07.282 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.283 5
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.283 6
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.283 7
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.283 8
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.283 9
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.283 10
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.283 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:07.849 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:07.849 10:15:44 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:08:07.849 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:07.849 10:15:44 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:09.222 10:15:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:09.222 10:15:46 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12
00:08:09.222 10:15:46 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:08:09.222 10:15:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:09.222 10:15:46 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:10.154 10:15:47 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:10.154
00:08:10.154 real 0m3.100s
00:08:10.154 user 0m0.024s
00:08:10.154 sys 0m0.007s
00:08:10.154 10:15:47 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:10.154 10:15:47 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:10.154 ************************************
00:08:10.154 END TEST scheduler_create_thread
00:08:10.154 ************************************
00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0
00:08:10.411 10:15:47 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:08:10.411 10:15:47 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 442567
00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 442567 ']'
00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 442567
00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@953 -- # uname
00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 442567
00:08:10.411 10:15:47 event.event_scheduler --
common/autotest_common.sh@954 -- # process_name=reactor_2 00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 442567' 00:08:10.411 killing process with pid 442567 00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 442567 00:08:10.411 10:15:47 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 442567 00:08:10.669 [2024-07-15 10:15:47.732144] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:08:10.927 00:08:10.927 real 0m4.917s 00:08:10.927 user 0m9.498s 00:08:10.927 sys 0m0.508s 00:08:10.927 10:15:47 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:10.927 10:15:47 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:10.927 ************************************ 00:08:10.927 END TEST event_scheduler 00:08:10.927 ************************************ 00:08:10.927 10:15:48 event -- common/autotest_common.sh@1142 -- # return 0 00:08:10.927 10:15:48 event -- event/event.sh@51 -- # modprobe -n nbd 00:08:10.927 10:15:48 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:08:10.927 10:15:48 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:10.927 10:15:48 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:10.927 10:15:48 event -- common/autotest_common.sh@10 -- # set +x 00:08:10.927 ************************************ 00:08:10.927 START TEST app_repeat 00:08:10.927 ************************************ 00:08:10.927 10:15:48 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:10.927 10:15:48 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@19 -- # repeat_pid=443315 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 443315' 00:08:10.927 Process app_repeat pid: 443315 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:08:10.927 spdk_app_start Round 0 00:08:10.927 10:15:48 event.app_repeat -- event/event.sh@25 -- # waitforlisten 443315 /var/tmp/spdk-nbd.sock 00:08:10.927 10:15:48 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 443315 ']' 00:08:10.927 10:15:48 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:10.927 10:15:48 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:10.927 10:15:48 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:10.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:10.927 10:15:48 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:10.927 10:15:48 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:10.927 [2024-07-15 10:15:48.117158] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:10.928 [2024-07-15 10:15:48.117227] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid443315 ] 00:08:11.186 [2024-07-15 10:15:48.248777] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:11.186 [2024-07-15 10:15:48.351284] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:11.186 [2024-07-15 10:15:48.351289] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.120 10:15:49 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:12.120 10:15:49 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:12.120 10:15:49 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:12.120 Malloc0 00:08:12.378 10:15:49 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:12.378 Malloc1 00:08:12.378 10:15:49 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@92 -- 
# nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:12.378 10:15:49 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:12.636 /dev/nbd0 00:08:12.636 10:15:49 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:12.636 10:15:49 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:12.636 10:15:49 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:12.636 1+0 records in 00:08:12.636 1+0 records out 00:08:12.636 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224469 s, 18.2 MB/s 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:12.636 10:15:49 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:12.636 10:15:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:12.636 10:15:49 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:12.636 10:15:49 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:13.201 /dev/nbd1 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:13.201 10:15:50 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:13.201 1+0 records in 00:08:13.201 1+0 records out 00:08:13.201 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240161 s, 17.1 MB/s 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:13.201 10:15:50 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:13.201 10:15:50 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:13.201 { 00:08:13.201 "nbd_device": "/dev/nbd0", 00:08:13.201 "bdev_name": "Malloc0" 00:08:13.201 }, 00:08:13.201 { 00:08:13.201 
"nbd_device": "/dev/nbd1", 00:08:13.201 "bdev_name": "Malloc1" 00:08:13.201 } 00:08:13.201 ]' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:13.460 { 00:08:13.460 "nbd_device": "/dev/nbd0", 00:08:13.460 "bdev_name": "Malloc0" 00:08:13.460 }, 00:08:13.460 { 00:08:13.460 "nbd_device": "/dev/nbd1", 00:08:13.460 "bdev_name": "Malloc1" 00:08:13.460 } 00:08:13.460 ]' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:13.460 /dev/nbd1' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:13.460 /dev/nbd1' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:13.460 256+0 records in 00:08:13.460 256+0 
records out 00:08:13.460 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011456 s, 91.5 MB/s 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:13.460 256+0 records in 00:08:13.460 256+0 records out 00:08:13.460 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.019289 s, 54.4 MB/s 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:13.460 256+0 records in 00:08:13.460 256+0 records out 00:08:13.460 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0203165 s, 51.6 MB/s 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:13.460 10:15:50 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:13.718 10:15:50 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:13.977 10:15:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:14.235 10:15:51 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:14.235 10:15:51 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:14.493 10:15:51 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:14.752 [2024-07-15 10:15:51.841255] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:14.752 [2024-07-15 10:15:51.939447] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:14.752 [2024-07-15 10:15:51.939453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.011 [2024-07-15 10:15:51.991698] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:15.011 [2024-07-15 10:15:51.991751] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:17.537 10:15:54 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:17.537 10:15:54 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:08:17.537 spdk_app_start Round 1 00:08:17.537 10:15:54 event.app_repeat -- event/event.sh@25 -- # waitforlisten 443315 /var/tmp/spdk-nbd.sock 00:08:17.537 10:15:54 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 443315 ']' 00:08:17.537 10:15:54 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:17.537 10:15:54 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:17.537 10:15:54 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:17.537 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:17.537 10:15:54 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:17.537 10:15:54 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:17.795 10:15:54 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:17.795 10:15:54 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:17.795 10:15:54 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:18.052 Malloc0 00:08:18.052 10:15:55 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:18.310 Malloc1 00:08:18.310 10:15:55 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:18.310 10:15:55 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:18.568 /dev/nbd0 00:08:18.568 10:15:55 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:18.569 10:15:55 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:18.569 1+0 records in 00:08:18.569 1+0 records out 00:08:18.569 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000161739 s, 25.3 MB/s 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:18.569 10:15:55 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:18.569 10:15:55 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:18.569 10:15:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:18.569 10:15:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:18.569 10:15:55 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:18.826 /dev/nbd1 00:08:18.827 10:15:55 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:18.827 10:15:55 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:18.827 1+0 records in 00:08:18.827 1+0 records out 00:08:18.827 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258692 s, 15.8 MB/s 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:18.827 10:15:55 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:18.827 10:15:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:18.827 10:15:55 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:18.827 10:15:55 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:18.827 10:15:55 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:18.827 10:15:55 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:19.085 { 00:08:19.085 "nbd_device": "/dev/nbd0", 00:08:19.085 "bdev_name": "Malloc0" 00:08:19.085 }, 00:08:19.085 { 00:08:19.085 "nbd_device": "/dev/nbd1", 00:08:19.085 "bdev_name": "Malloc1" 00:08:19.085 } 00:08:19.085 ]' 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:19.085 { 00:08:19.085 "nbd_device": "/dev/nbd0", 00:08:19.085 "bdev_name": "Malloc0" 00:08:19.085 }, 00:08:19.085 { 00:08:19.085 "nbd_device": "/dev/nbd1", 00:08:19.085 "bdev_name": "Malloc1" 00:08:19.085 } 00:08:19.085 ]' 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:19.085 /dev/nbd1' 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:19.085 /dev/nbd1' 00:08:19.085 
10:15:56 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:19.085 256+0 records in 00:08:19.085 256+0 records out 00:08:19.085 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109267 s, 96.0 MB/s 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:19.085 256+0 records in 00:08:19.085 256+0 records out 00:08:19.085 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0186017 s, 56.4 MB/s 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:19.085 10:15:56 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:19.085 256+0 records in 00:08:19.085 256+0 records out 00:08:19.085 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0236377 s, 44.4 MB/s 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:19.344 10:15:56 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:19.602 10:15:56 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:19.861 10:15:56 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:20.118 10:15:57 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:20.118 10:15:57 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:20.376 10:15:57 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:20.635 [2024-07-15 10:15:57.672213] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:20.635 [2024-07-15 10:15:57.770147] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:20.635 [2024-07-15 10:15:57.770153] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:08:20.635 [2024-07-15 10:15:57.817137] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:20.635 [2024-07-15 10:15:57.817188] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:23.944 10:16:00 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:23.944 10:16:00 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:08:23.944 spdk_app_start Round 2 00:08:23.944 10:16:00 event.app_repeat -- event/event.sh@25 -- # waitforlisten 443315 /var/tmp/spdk-nbd.sock 00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 443315 ']' 00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:23.944 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:23.944 10:16:00 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:23.944 10:16:00 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:23.944 Malloc0 00:08:23.944 10:16:00 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:24.202 Malloc1 00:08:24.202 10:16:01 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:24.202 10:16:01 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:24.460 /dev/nbd0 00:08:24.460 10:16:01 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:24.460 10:16:01 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:24.460 1+0 records in 00:08:24.460 1+0 records out 00:08:24.460 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228021 s, 18.0 MB/s 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:24.460 10:16:01 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:24.460 10:16:01 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:24.460 10:16:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:24.460 10:16:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:24.460 10:16:01 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:24.718 /dev/nbd1 00:08:24.718 10:16:01 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:24.718 10:16:01 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:24.718 1+0 records in 00:08:24.718 1+0 records out 00:08:24.718 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283622 s, 14.4 MB/s 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:24.718 10:16:01 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:24.718 10:16:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:24.718 10:16:01 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:24.718 10:16:01 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:24.718 10:16:01 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:24.718 10:16:01 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:24.976 10:16:01 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:24.976 { 00:08:24.976 "nbd_device": "/dev/nbd0", 00:08:24.976 "bdev_name": "Malloc0" 00:08:24.976 }, 00:08:24.976 { 00:08:24.976 "nbd_device": "/dev/nbd1", 00:08:24.976 "bdev_name": "Malloc1" 00:08:24.976 } 00:08:24.976 ]' 00:08:24.976 10:16:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:24.976 { 00:08:24.976 "nbd_device": "/dev/nbd0", 00:08:24.976 "bdev_name": "Malloc0" 00:08:24.976 }, 00:08:24.976 { 00:08:24.976 "nbd_device": "/dev/nbd1", 00:08:24.976 "bdev_name": "Malloc1" 00:08:24.976 } 00:08:24.976 ]' 00:08:24.976 10:16:01 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:24.976 /dev/nbd1' 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:24.976 /dev/nbd1' 00:08:24.976 
10:16:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:24.976 256+0 records in 00:08:24.976 256+0 records out 00:08:24.976 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105571 s, 99.3 MB/s 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:24.976 256+0 records in 00:08:24.976 256+0 records out 00:08:24.976 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0263943 s, 39.7 MB/s 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:24.976 256+0 records in 00:08:24.976 256+0 records out 00:08:24.976 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200324 s, 52.3 MB/s 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:24.976 10:16:02 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:24.977 10:16:02 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:25.235 10:16:02 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:25.573 10:16:02 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:25.573 10:16:02 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:25.573 10:16:02 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:25.573 10:16:02 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:25.573 10:16:02 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.573 10:16:02 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:25.573 10:16:02 event.app_repeat -- bdev/nbd_common.sh@39 
-- # sleep 0.1 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:25.832 10:16:02 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:26.091 10:16:03 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:26.091 10:16:03 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:26.349 
10:16:03 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:26.606 [2024-07-15 10:16:03.620167] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:26.606 [2024-07-15 10:16:03.719345] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:26.606 [2024-07-15 10:16:03.719350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.606 [2024-07-15 10:16:03.771483] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:26.606 [2024-07-15 10:16:03.771538] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:29.882 10:16:06 event.app_repeat -- event/event.sh@38 -- # waitforlisten 443315 /var/tmp/spdk-nbd.sock 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 443315 ']' 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:29.882 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:29.882 10:16:06 event.app_repeat -- event/event.sh@39 -- # killprocess 443315 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 443315 ']' 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 443315 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 443315 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 443315' 00:08:29.882 killing process with pid 443315 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@967 -- # kill 443315 00:08:29.882 10:16:06 event.app_repeat -- common/autotest_common.sh@972 -- # wait 443315 00:08:29.882 spdk_app_start is called in Round 0. 00:08:29.882 Shutdown signal received, stop current app iteration 00:08:29.882 Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 reinitialization... 00:08:29.882 spdk_app_start is called in Round 1. 00:08:29.882 Shutdown signal received, stop current app iteration 00:08:29.882 Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 reinitialization... 00:08:29.882 spdk_app_start is called in Round 2. 
00:08:29.882 Shutdown signal received, stop current app iteration 00:08:29.882 Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 reinitialization... 00:08:29.882 spdk_app_start is called in Round 3. 00:08:29.883 Shutdown signal received, stop current app iteration 00:08:29.883 10:16:06 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:08:29.883 10:16:06 event.app_repeat -- event/event.sh@42 -- # return 0 00:08:29.883 00:08:29.883 real 0m18.797s 00:08:29.883 user 0m40.592s 00:08:29.883 sys 0m3.908s 00:08:29.883 10:16:06 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.883 10:16:06 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:29.883 ************************************ 00:08:29.883 END TEST app_repeat 00:08:29.883 ************************************ 00:08:29.883 10:16:06 event -- common/autotest_common.sh@1142 -- # return 0 00:08:29.883 10:16:06 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:08:29.883 00:08:29.883 real 0m28.294s 00:08:29.883 user 0m56.900s 00:08:29.883 sys 0m5.197s 00:08:29.883 10:16:06 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.883 10:16:06 event -- common/autotest_common.sh@10 -- # set +x 00:08:29.883 ************************************ 00:08:29.883 END TEST event 00:08:29.883 ************************************ 00:08:29.883 10:16:06 -- common/autotest_common.sh@1142 -- # return 0 00:08:29.883 10:16:06 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:08:29.883 10:16:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:29.883 10:16:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.883 10:16:06 -- common/autotest_common.sh@10 -- # set +x 00:08:29.883 ************************************ 00:08:29.883 START TEST thread 00:08:29.883 ************************************ 00:08:29.883 10:16:06 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:08:30.141 * Looking for test storage... 00:08:30.141 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:08:30.141 10:16:07 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:30.141 10:16:07 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:30.141 10:16:07 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:30.141 10:16:07 thread -- common/autotest_common.sh@10 -- # set +x 00:08:30.141 ************************************ 00:08:30.141 START TEST thread_poller_perf 00:08:30.141 ************************************ 00:08:30.141 10:16:07 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:30.141 [2024-07-15 10:16:07.168126] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:30.141 [2024-07-15 10:16:07.168190] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid446020 ] 00:08:30.141 [2024-07-15 10:16:07.296111] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.400 [2024-07-15 10:16:07.397094] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.400 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:08:31.334 ====================================== 00:08:31.334 busy:2311229768 (cyc) 00:08:31.334 total_run_count: 265000 00:08:31.334 tsc_hz: 2300000000 (cyc) 00:08:31.334 ====================================== 00:08:31.334 poller_cost: 8721 (cyc), 3791 (nsec) 00:08:31.334 00:08:31.334 real 0m1.360s 00:08:31.334 user 0m1.211s 00:08:31.334 sys 0m0.141s 00:08:31.334 10:16:08 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.334 10:16:08 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:31.334 ************************************ 00:08:31.334 END TEST thread_poller_perf 00:08:31.334 ************************************ 00:08:31.592 10:16:08 thread -- common/autotest_common.sh@1142 -- # return 0 00:08:31.592 10:16:08 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:31.592 10:16:08 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:31.592 10:16:08 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.592 10:16:08 thread -- common/autotest_common.sh@10 -- # set +x 00:08:31.592 ************************************ 00:08:31.592 START TEST thread_poller_perf 00:08:31.592 ************************************ 00:08:31.592 10:16:08 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:31.592 [2024-07-15 10:16:08.610290] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:08:31.592 [2024-07-15 10:16:08.610355] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid446216 ] 00:08:31.592 [2024-07-15 10:16:08.741325] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.851 [2024-07-15 10:16:08.846529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.851 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:08:32.785 ====================================== 00:08:32.785 busy:2302844634 (cyc) 00:08:32.785 total_run_count: 3523000 00:08:32.785 tsc_hz: 2300000000 (cyc) 00:08:32.785 ====================================== 00:08:32.785 poller_cost: 653 (cyc), 283 (nsec) 00:08:32.785 00:08:32.785 real 0m1.358s 00:08:32.785 user 0m1.208s 00:08:32.785 sys 0m0.143s 00:08:32.785 10:16:09 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:32.785 10:16:09 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:32.785 ************************************ 00:08:32.785 END TEST thread_poller_perf 00:08:32.785 ************************************ 00:08:32.785 10:16:09 thread -- common/autotest_common.sh@1142 -- # return 0 00:08:32.785 10:16:09 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:08:32.785 00:08:32.785 real 0m2.987s 00:08:32.785 user 0m2.512s 00:08:32.785 sys 0m0.484s 00:08:33.044 10:16:09 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:33.044 10:16:09 thread -- common/autotest_common.sh@10 -- # set +x 00:08:33.044 ************************************ 00:08:33.044 END TEST thread 00:08:33.044 ************************************ 00:08:33.044 10:16:10 -- common/autotest_common.sh@1142 -- # return 0 00:08:33.044 10:16:10 -- spdk/autotest.sh@183 -- # run_test accel 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:33.044 10:16:10 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:33.044 10:16:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.044 10:16:10 -- common/autotest_common.sh@10 -- # set +x 00:08:33.044 ************************************ 00:08:33.044 START TEST accel 00:08:33.044 ************************************ 00:08:33.044 10:16:10 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:33.044 * Looking for test storage... 00:08:33.044 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:33.044 10:16:10 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:08:33.044 10:16:10 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:08:33.044 10:16:10 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:33.044 10:16:10 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=446456 00:08:33.044 10:16:10 accel -- accel/accel.sh@63 -- # waitforlisten 446456 00:08:33.044 10:16:10 accel -- common/autotest_common.sh@829 -- # '[' -z 446456 ']' 00:08:33.044 10:16:10 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:33.044 10:16:10 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:33.044 10:16:10 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:33.044 10:16:10 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:33.044 10:16:10 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:33.044 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:33.044 10:16:10 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:33.044 10:16:10 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:33.044 10:16:10 accel -- common/autotest_common.sh@10 -- # set +x 00:08:33.044 10:16:10 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:33.044 10:16:10 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:33.044 10:16:10 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:33.044 10:16:10 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:33.044 10:16:10 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:33.044 10:16:10 accel -- accel/accel.sh@41 -- # jq -r . 00:08:33.044 [2024-07-15 10:16:10.235410] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:33.044 [2024-07-15 10:16:10.235490] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid446456 ] 00:08:33.303 [2024-07-15 10:16:10.363589] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.303 [2024-07-15 10:16:10.461588] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@862 -- # return 0 00:08:34.271 10:16:11 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:34.271 10:16:11 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:34.271 10:16:11 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:34.271 10:16:11 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:08:34.271 10:16:11 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:34.271 10:16:11 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:34.271 10:16:11 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # 
read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in 
"${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # IFS== 00:08:34.271 10:16:11 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:34.271 10:16:11 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:34.271 10:16:11 accel -- accel/accel.sh@75 -- # killprocess 446456 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@948 -- # '[' -z 446456 ']' 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@952 -- # kill -0 446456 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@953 -- # uname 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 446456 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 446456' 00:08:34.271 killing process with pid 446456 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@967 -- # kill 446456 00:08:34.271 10:16:11 accel -- common/autotest_common.sh@972 -- # wait 446456 00:08:34.530 10:16:11 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:34.530 10:16:11 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:08:34.530 10:16:11 
accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:08:34.530 10:16:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.530 10:16:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:34.530 10:16:11 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:08:34.530 10:16:11 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:08:34.530 10:16:11 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.530 10:16:11 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:08:34.789 10:16:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:34.789 10:16:11 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:34.789 10:16:11 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:34.789 10:16:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.789 10:16:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:34.789 ************************************ 00:08:34.789 START TEST accel_missing_filename 00:08:34.789 ************************************ 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:34.789 10:16:11 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:08:34.789 10:16:11 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:08:34.789 10:16:11 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:08:34.789 [2024-07-15 10:16:11.841380] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:34.789 [2024-07-15 10:16:11.841443] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid446809 ] 00:08:34.789 [2024-07-15 10:16:11.956276] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.048 [2024-07-15 10:16:12.054622] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.048 [2024-07-15 10:16:12.115900] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:35.048 [2024-07-15 10:16:12.183467] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:08:35.307 A filename is required. 
00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:35.307 00:08:35.307 real 0m0.474s 00:08:35.307 user 0m0.318s 00:08:35.307 sys 0m0.182s 00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:35.307 10:16:12 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:08:35.307 ************************************ 00:08:35.307 END TEST accel_missing_filename 00:08:35.307 ************************************ 00:08:35.307 10:16:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:35.307 10:16:12 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:35.307 10:16:12 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:08:35.307 10:16:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:35.307 10:16:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:35.307 ************************************ 00:08:35.307 START TEST accel_compress_verify 00:08:35.307 ************************************ 00:08:35.307 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:35.307 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:08:35.307 10:16:12 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:35.307 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:35.307 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:35.307 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:35.307 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:35.307 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:35.307 10:16:12 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:35.308 10:16:12 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:08:35.308 [2024-07-15 10:16:12.403142] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:08:35.308 [2024-07-15 10:16:12.403208] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid446860 ] 00:08:35.567 [2024-07-15 10:16:12.533653] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.567 [2024-07-15 10:16:12.637979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.567 [2024-07-15 10:16:12.706449] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:35.826 [2024-07-15 10:16:12.780463] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:08:35.826 00:08:35.826 Compression does not support the verify option, aborting. 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:35.826 00:08:35.826 real 0m0.512s 00:08:35.826 user 0m0.344s 00:08:35.826 sys 0m0.199s 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:35.826 10:16:12 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:08:35.826 ************************************ 00:08:35.826 END TEST accel_compress_verify 00:08:35.826 ************************************ 00:08:35.826 10:16:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:35.826 10:16:12 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:08:35.826 10:16:12 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:35.826 10:16:12 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:35.826 10:16:12 accel -- common/autotest_common.sh@10 -- # set +x 00:08:35.826 ************************************ 00:08:35.826 START TEST accel_wrong_workload 00:08:35.826 ************************************ 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:35.826 10:16:12 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:08:35.826 10:16:12 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:08:35.826 Unsupported workload type: foobar 00:08:35.826 [2024-07-15 10:16:12.994857] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:35.826 accel_perf options: 00:08:35.826 [-h help message] 00:08:35.826 [-q queue depth per core] 00:08:35.826 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:35.826 [-T number of threads per core 00:08:35.826 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:35.826 [-t time in seconds] 00:08:35.826 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:35.826 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:35.826 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:35.826 [-l for compress/decompress workloads, name of uncompressed input file 00:08:35.826 [-S for crc32c workload, use this seed value (default 0) 00:08:35.826 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:35.826 [-f for fill workload, use this BYTE value (default 255) 00:08:35.826 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:35.826 [-y verify result if this switch is on] 00:08:35.826 [-a tasks to allocate per core (default: same value as -q)] 00:08:35.826 Can be used to spread operations across a wider range of memory. 
00:08:35.826 10:16:12 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:08:35.826 10:16:13 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:35.826 10:16:13 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:35.826 10:16:13 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:35.826 00:08:35.826 real 0m0.044s 00:08:35.826 user 0m0.028s 00:08:35.826 sys 0m0.016s 00:08:35.826 10:16:13 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:35.826 10:16:13 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:08:35.826 ************************************ 00:08:35.826 END TEST accel_wrong_workload 00:08:35.826 ************************************ 00:08:35.826 Error: writing output failed: Broken pipe 00:08:36.084 10:16:13 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:36.084 10:16:13 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:08:36.084 10:16:13 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:08:36.085 10:16:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.085 10:16:13 accel -- common/autotest_common.sh@10 -- # set +x 00:08:36.085 ************************************ 00:08:36.085 START TEST accel_negative_buffers 00:08:36.085 ************************************ 00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:08:36.085 10:16:13 accel.accel_negative_buffers -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:08:36.085 10:16:13 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:08:36.085 -x option must be non-negative. 00:08:36.085 [2024-07-15 10:16:13.123617] app.c:1450:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:08:36.085 accel_perf options: 00:08:36.085 [-h help message] 00:08:36.085 [-q queue depth per core] 00:08:36.085 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:36.085 [-T number of threads per core 00:08:36.085 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:08:36.085 [-t time in seconds] 00:08:36.085 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:36.085 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:36.085 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:36.085 [-l for compress/decompress workloads, name of uncompressed input file 00:08:36.085 [-S for crc32c workload, use this seed value (default 0) 00:08:36.085 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:36.085 [-f for fill workload, use this BYTE value (default 255) 00:08:36.085 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:36.085 [-y verify result if this switch is on] 00:08:36.085 [-a tasks to allocate per core (default: same value as -q)] 00:08:36.085 Can be used to spread operations across a wider range of memory. 
00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1
00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:08:36.085
00:08:36.085 real	0m0.043s
00:08:36.085 user	0m0.024s
00:08:36.085 sys	0m0.019s
00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:36.085 10:16:13 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x
00:08:36.085 ************************************
00:08:36.085 END TEST accel_negative_buffers
00:08:36.085 ************************************
00:08:36.085 Error: writing output failed: Broken pipe
00:08:36.085 10:16:13 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:36.085 10:16:13 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y
00:08:36.085 10:16:13 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']'
00:08:36.085 10:16:13 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:36.085 10:16:13 accel -- common/autotest_common.sh@10 -- # set +x
00:08:36.085 ************************************
00:08:36.085 START TEST accel_crc32c
00:08:36.085 ************************************
00:08:36.085 10:16:13 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y
00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc
00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module
00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y
00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=:
00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val
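Each case in this log is driven through a `run_test` wrapper that prints the START/END banners and the real/user/sys timing shown above. A rough, self-contained sketch of that banner-and-timing pattern (this `run_test` is an illustrative stand-in, not SPDK's actual autotest helper):

```shell
# Rough sketch of the START/END banner and timing pattern visible in this log.
# This run_test is a hypothetical stand-in for SPDK's autotest helper.
run_test() {
  local name="$1"; shift
  echo "************************************"
  echo "START TEST $name"
  echo "************************************"
  time "$@"            # emits the real/user/sys lines seen in the log (on stderr)
  local rc=$?          # capture the wrapped command's exit status
  echo "************************************"
  echo "END TEST $name"
  echo "************************************"
  return "$rc"
}

run_test demo_case true
```

The wrapper propagates the wrapped command's exit status, which is how a failing case (like a rejected option) is distinguished from a passing one.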
00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:36.085 10:16:13 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:36.085 [2024-07-15 10:16:13.226218] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:36.085 [2024-07-15 10:16:13.226275] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid447089 ] 00:08:36.344 [2024-07-15 10:16:13.351527] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.344 [2024-07-15 10:16:13.448957] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 
00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 
accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:36.344 10:16:13 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:37.719 10:16:14 
accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:37.719 10:16:14 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:37.719 00:08:37.719 real 0m1.473s 00:08:37.719 user 0m0.011s 00:08:37.719 sys 0m0.002s 00:08:37.719 10:16:14 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:37.719 10:16:14 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:37.719 ************************************ 00:08:37.719 END TEST accel_crc32c 00:08:37.719 ************************************ 00:08:37.719 10:16:14 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:37.719 10:16:14 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:08:37.719 10:16:14 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:37.719 10:16:14 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.719 10:16:14 accel -- common/autotest_common.sh@10 -- # set +x 00:08:37.719 ************************************ 
00:08:37.719 START TEST accel_crc32c_C2 00:08:37.719 ************************************ 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:37.719 10:16:14 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:37.719 [2024-07-15 10:16:14.772002] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:08:37.719 [2024-07-15 10:16:14.772057] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid447283 ] 00:08:37.719 [2024-07-15 10:16:14.900702] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.978 [2024-07-15 10:16:15.002720] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.978 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:37.979 
10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:37.979 10:16:15 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:39.356 00:08:39.356 real 0m1.501s 00:08:39.356 user 0m0.009s 00:08:39.356 sys 0m0.003s 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.356 10:16:16 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:39.356 ************************************ 00:08:39.356 END TEST accel_crc32c_C2 00:08:39.356 ************************************ 00:08:39.356 10:16:16 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:39.356 10:16:16 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:08:39.356 10:16:16 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:39.356 10:16:16 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.356 10:16:16 accel -- common/autotest_common.sh@10 -- # set +x 00:08:39.356 ************************************ 00:08:39.356 START TEST accel_copy 00:08:39.356 ************************************ 00:08:39.356 10:16:16 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:39.356 10:16:16 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:08:39.356 [2024-07-15 10:16:16.361226] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:39.356 [2024-07-15 10:16:16.361357] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid447480 ] 00:08:39.615 [2024-07-15 10:16:16.556912] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.615 [2024-07-15 10:16:16.657548] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:08:39.615 10:16:16 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.615 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.616 10:16:16 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:39.616 10:16:16 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:08:40.989 10:16:17 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:40.989 00:08:40.989 real 0m1.583s 00:08:40.989 user 0m0.012s 00:08:40.989 sys 0m0.000s 00:08:40.989 10:16:17 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:40.989 10:16:17 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:08:40.989 ************************************ 00:08:40.989 END TEST accel_copy 00:08:40.989 ************************************ 00:08:40.989 10:16:17 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:40.989 10:16:17 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:40.989 10:16:17 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:08:40.989 10:16:17 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:40.989 10:16:17 accel -- common/autotest_common.sh@10 -- # set +x 00:08:40.989 ************************************ 00:08:40.989 START TEST accel_fill 00:08:40.989 ************************************ 00:08:40.989 10:16:17 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:40.989 10:16:17 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:08:40.989 10:16:17 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:08:40.989 10:16:17 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:40.989 10:16:17 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:40.989 10:16:17 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:40.989 10:16:17 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:08:40.990 10:16:17 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:08:40.990 [2024-07-15 10:16:18.000201] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:40.990 [2024-07-15 10:16:18.000261] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid447676 ] 00:08:40.990 [2024-07-15 10:16:18.128942] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.247 [2024-07-15 10:16:18.230997] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.247 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.248 10:16:18 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:41.248 10:16:18 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:42.657 10:16:19 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:08:42.657 10:16:19 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:42.657 00:08:42.657 real 0m1.506s 00:08:42.657 user 0m0.009s 00:08:42.657 sys 0m0.003s 00:08:42.657 10:16:19 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:42.657 10:16:19 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:08:42.657 ************************************ 00:08:42.657 END TEST accel_fill 00:08:42.657 ************************************ 00:08:42.657 10:16:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:42.657 10:16:19 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:42.657 10:16:19 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:08:42.657 10:16:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:42.657 10:16:19 accel -- common/autotest_common.sh@10 -- # set +x 00:08:42.657 ************************************ 00:08:42.657 START TEST accel_copy_crc32c 00:08:42.658 ************************************ 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:42.658 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:42.658 [2024-07-15 10:16:19.574615] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:08:42.658 [2024-07-15 10:16:19.574674] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid447879 ] 00:08:42.658 [2024-07-15 10:16:19.703858] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.658 [2024-07-15 10:16:19.801053] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.916 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:42.917 10:16:19 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:43.851 10:16:21 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:43.851 00:08:43.851 real 0m1.489s 00:08:43.851 user 0m0.010s 00:08:43.851 sys 0m0.002s 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:43.851 10:16:21 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:43.851 ************************************ 00:08:43.851 END TEST accel_copy_crc32c 00:08:43.851 ************************************ 00:08:44.109 10:16:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:44.109 10:16:21 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:44.109 10:16:21 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:44.109 10:16:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:44.109 10:16:21 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.109 ************************************ 00:08:44.109 START TEST accel_copy_crc32c_C2 00:08:44.109 
************************************ 00:08:44.109 10:16:21 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:44.109 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:44.109 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:44.109 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.109 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.109 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:44.109 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:44.110 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:44.110 [2024-07-15 10:16:21.131889] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:08:44.110 [2024-07-15 10:16:21.131955] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid448181 ] 00:08:44.110 [2024-07-15 10:16:21.248768] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.368 [2024-07-15 10:16:21.349849] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:44.368 10:16:21 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.368 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:44.369 10:16:21 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes
00:08:44.369-00:08:45.745 10:16:21-10:16:22 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19-@21 -- # [repeated xtrace loop records omitted: val= / case "$var" in / IFS=: / read -r var val]
00:08:45.745 10:16:22 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:45.745 10:16:22 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]]
00:08:45.745 10:16:22 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:45.745
00:08:45.745 real 0m1.490s
00:08:45.745 user 0m0.007s
00:08:45.745 sys 0m0.005s
00:08:45.745 10:16:22 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:45.745 10:16:22 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x
00:08:45.745 ************************************
00:08:45.745 END TEST accel_copy_crc32c_C2
00:08:45.745 ************************************
00:08:45.745 10:16:22 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:45.745 10:16:22 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y
00:08:45.745 10:16:22 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:08:45.745 10:16:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:45.745 10:16:22 accel -- common/autotest_common.sh@10 -- # set +x
00:08:45.745 ************************************
00:08:45.745 START TEST accel_dualcast
00:08:45.745 ************************************
00:08:45.745 10:16:22 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y
00:08:45.745 10:16:22 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc
00:08:45.745 10:16:22 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module
00:08:45.745 10:16:22 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y
00:08:45.745 10:16:22 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y
00:08:45.745 10:16:22 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config [@31-@41 xtrace records omitted]
00:08:45.745 [2024-07-15 10:16:22.700815] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:45.745 [2024-07-15 10:16:22.700880] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid448430 ]
00:08:45.745 [2024-07-15 10:16:22.832075] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:45.745 [2024-07-15 10:16:22.936612] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:46.005 10:16:23 accel.accel_dualcast -- accel/accel.sh@19-@23 -- # [config xtrace loop omitted: val=0x1, val=dualcast (accel_opc=dualcast), val='4096 bytes', val=software (accel_module=software), val=32, val=32, val=1, val='1 seconds', val=Yes]
00:08:47.378 10:16:24 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:47.378 10:16:24 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]]
00:08:47.378 10:16:24 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:47.378
00:08:47.378 real 0m1.512s
00:08:47.378 user 0m0.013s
00:08:47.378 sys 0m0.000s
00:08:47.378 10:16:24 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:47.378 10:16:24 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x
00:08:47.378 ************************************
00:08:47.378 END TEST accel_dualcast
00:08:47.378 ************************************
00:08:47.378 10:16:24 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:47.378 10:16:24 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y
00:08:47.378 10:16:24 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:08:47.378 10:16:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:47.378 10:16:24 accel -- common/autotest_common.sh@10 -- # set +x
00:08:47.378 ************************************
00:08:47.378 START TEST accel_compare
00:08:47.378 ************************************
00:08:47.378 10:16:24 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y
00:08:47.378 10:16:24 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc
00:08:47.378 10:16:24 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module
00:08:47.378 10:16:24 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y
00:08:47.378 10:16:24 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y
00:08:47.378 10:16:24 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config [@31-@41 xtrace records omitted]
00:08:47.378 [2024-07-15 10:16:24.269235] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:47.378 [2024-07-15 10:16:24.269297] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid448643 ]
00:08:47.378 [2024-07-15 10:16:24.395440] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:47.378 [2024-07-15 10:16:24.496427] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:47.378 10:16:24 accel.accel_compare -- accel/accel.sh@19-@23 -- # [config xtrace loop omitted: val=0x1, val=compare (accel_opc=compare), val='4096 bytes', val=software (accel_module=software), val=32, val=32, val=1, val='1 seconds', val=Yes]
00:08:48.754 10:16:25 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:48.754 10:16:25 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]]
00:08:48.754 10:16:25 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:48.754
00:08:48.754 real 0m1.495s
00:08:48.754 user 0m0.006s
00:08:48.754 sys 0m0.006s
00:08:48.754 10:16:25 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:48.754 10:16:25 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x
00:08:48.754 ************************************
00:08:48.754 END TEST accel_compare
00:08:48.754 ************************************
00:08:48.754 10:16:25 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:48.754 10:16:25 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y
00:08:48.755 10:16:25 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:08:48.755 10:16:25 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:48.755 10:16:25 accel -- common/autotest_common.sh@10 -- # set +x
00:08:48.755 ************************************
00:08:48.755 START TEST accel_xor
00:08:48.755 ************************************
00:08:48.755 10:16:25 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y
00:08:48.755 10:16:25 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc
00:08:48.755 10:16:25 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module
00:08:48.755 10:16:25 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y
00:08:48.755 10:16:25 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y
00:08:48.755 10:16:25 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config [@31-@41 xtrace records omitted]
00:08:48.755 [2024-07-15 10:16:25.833226] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:48.755 [2024-07-15 10:16:25.833285] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid448840 ]
00:08:49.014 [2024-07-15 10:16:25.960939] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:49.014 [2024-07-15 10:16:26.063411] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:49.014 10:16:26 accel.accel_xor -- accel/accel.sh@19-@23 -- # [config xtrace loop omitted: val=0x1, val=xor (accel_opc=xor), val=2, val='4096 bytes', val=software (accel_module=software), val=32, val=32, val=1, val='1 seconds', val=Yes]
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]]
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:50.390
00:08:50.390 real 0m1.506s
00:08:50.390 user 0m0.012s
00:08:50.390 sys 0m0.000s
00:08:50.390 10:16:27 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:50.390 10:16:27 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x
00:08:50.390 ************************************
00:08:50.390 END TEST accel_xor
00:08:50.390 ************************************
00:08:50.390 10:16:27 accel -- common/autotest_common.sh@1142 -- # return 0
00:08:50.390 10:16:27 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3
00:08:50.390 10:16:27 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']'
00:08:50.390 10:16:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:50.390 10:16:27 accel -- common/autotest_common.sh@10 -- # set +x
00:08:50.390 ************************************
00:08:50.390 START TEST accel_xor
00:08:50.390 ************************************
00:08:50.390 10:16:27 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3
00:08:50.390 10:16:27 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config [@31-@41 xtrace records omitted]
00:08:50.390 [2024-07-15 10:16:27.408400] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:08:50.390 [2024-07-15 10:16:27.408460] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid449037 ]
00:08:50.390 [2024-07-15 10:16:27.537512] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:50.648 [2024-07-15 10:16:27.635540] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.648 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 
10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:50.649 10:16:27 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:08:52.025 10:16:28 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:52.025 00:08:52.025 real 0m1.493s 00:08:52.025 user 0m0.010s 00:08:52.025 sys 0m0.002s 00:08:52.025 10:16:28 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:08:52.025 10:16:28 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:08:52.025 ************************************ 00:08:52.025 END TEST accel_xor 00:08:52.025 ************************************ 00:08:52.025 10:16:28 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:52.025 10:16:28 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:08:52.025 10:16:28 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:52.025 10:16:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:52.025 10:16:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:52.025 ************************************ 00:08:52.025 START TEST accel_dif_verify 00:08:52.025 ************************************ 00:08:52.025 10:16:28 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:52.025 10:16:28 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:08:52.025 [2024-07-15 10:16:28.984128] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:52.025 [2024-07-15 10:16:28.984254] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid449244 ] 00:08:52.025 [2024-07-15 10:16:29.180684] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.284 [2024-07-15 10:16:29.287687] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- 
# case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:52.284 10:16:29 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:08:53.659 10:16:30 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:53.659 00:08:53.659 real 0m1.592s 00:08:53.659 user 0m0.010s 00:08:53.659 sys 0m0.003s 00:08:53.659 10:16:30 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:53.659 10:16:30 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:08:53.659 ************************************ 00:08:53.659 END TEST accel_dif_verify 00:08:53.659 
************************************ 00:08:53.659 10:16:30 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:53.659 10:16:30 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:08:53.659 10:16:30 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:53.659 10:16:30 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:53.659 10:16:30 accel -- common/autotest_common.sh@10 -- # set +x 00:08:53.659 ************************************ 00:08:53.659 START TEST accel_dif_generate 00:08:53.659 ************************************ 00:08:53.659 10:16:30 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:53.659 10:16:30 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:08:53.659 10:16:30 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:08:53.659 [2024-07-15 10:16:30.626570] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:53.659 [2024-07-15 10:16:30.626629] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid449598 ] 00:08:53.659 [2024-07-15 10:16:30.755611] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.918 [2024-07-15 10:16:30.857550] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:53.918 10:16:30 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 
10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:53.918 10:16:30 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:55.290 10:16:32 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:55.290 00:08:55.290 real 0m1.504s 00:08:55.290 user 0m0.011s 00:08:55.290 sys 0m0.002s 00:08:55.290 10:16:32 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:55.290 10:16:32 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:55.290 ************************************ 00:08:55.290 END TEST 
accel_dif_generate 00:08:55.290 ************************************ 00:08:55.290 10:16:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:55.290 10:16:32 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:08:55.290 10:16:32 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:55.290 10:16:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:55.290 10:16:32 accel -- common/autotest_common.sh@10 -- # set +x 00:08:55.290 ************************************ 00:08:55.290 START TEST accel_dif_generate_copy 00:08:55.290 ************************************ 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:08:55.290 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:55.291 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:08:55.291 [2024-07-15 10:16:32.213633] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:55.291 [2024-07-15 10:16:32.213704] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid449800 ] 00:08:55.291 [2024-07-15 10:16:32.343321] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.291 [2024-07-15 10:16:32.447956] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:55.549 10:16:32 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:33 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:56.921 00:08:56.921 real 0m1.513s 00:08:56.921 user 0m0.012s 00:08:56.921 sys 0m0.002s 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:56.921 10:16:33 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:56.921 ************************************ 00:08:56.921 END TEST 
accel_dif_generate_copy 00:08:56.921 ************************************ 00:08:56.921 10:16:33 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:56.921 10:16:33 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:56.921 10:16:33 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:56.921 10:16:33 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:08:56.921 10:16:33 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:56.921 10:16:33 accel -- common/autotest_common.sh@10 -- # set +x 00:08:56.921 ************************************ 00:08:56.921 START TEST accel_comp 00:08:56.921 ************************************ 00:08:56.921 10:16:33 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:56.921 10:16:33 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:56.921 10:16:33 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:08:56.921 [2024-07-15 10:16:33.798012] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:08:56.921 [2024-07-15 10:16:33.798070] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450005 ] 00:08:56.921 [2024-07-15 10:16:33.924732] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.921 [2024-07-15 10:16:34.023316] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:56.921 10:16:34 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:58.294 10:16:35 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:58.294 10:16:35 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:58.294 00:08:58.294 real 0m1.495s 00:08:58.294 user 0m0.009s 00:08:58.294 sys 0m0.004s 00:08:58.294 10:16:35 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:58.294 10:16:35 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:08:58.294 ************************************ 00:08:58.294 END TEST accel_comp 00:08:58.294 ************************************ 00:08:58.294 10:16:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:08:58.294 10:16:35 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:58.294 10:16:35 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:08:58.295 10:16:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:58.295 10:16:35 accel -- common/autotest_common.sh@10 -- # set +x 00:08:58.295 ************************************ 00:08:58.295 START TEST accel_decomp 00:08:58.295 ************************************ 00:08:58.295 10:16:35 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:58.295 10:16:35 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:58.295 10:16:35 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:58.295 [2024-07-15 10:16:35.378874] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:08:58.295 [2024-07-15 10:16:35.379008] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450202 ] 00:08:58.553 [2024-07-15 10:16:35.575312] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.553 [2024-07-15 10:16:35.684034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 
10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:58.811 10:16:35 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- 
accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:58.811 10:16:35 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 10:16:36 
accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:59.744 10:16:36 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:59.744 00:08:59.744 real 0m1.584s 00:08:59.744 user 0m0.012s 00:08:59.744 sys 0m0.001s 00:08:59.744 10:16:36 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:59.744 10:16:36 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:59.744 ************************************ 00:08:59.744 END TEST accel_decomp 00:08:59.744 ************************************ 00:09:00.002 10:16:36 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:00.002 10:16:36 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:00.002 10:16:36 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:00.002 10:16:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:00.002 10:16:36 accel -- common/autotest_common.sh@10 -- # set +x 00:09:00.002 ************************************ 00:09:00.002 START TEST accel_decomp_full 00:09:00.002 ************************************ 00:09:00.002 10:16:37 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:00.002 
10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=,
00:09:00.002 10:16:37 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r .
00:09:00.002 [2024-07-15 10:16:37.035646] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:09:00.002 [2024-07-15 10:16:37.035714] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450404 ]
00:09:00.002 [2024-07-15 10:16:37.165873] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:00.260 [2024-07-15 10:16:37.270179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes'
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds'
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:00.260 10:16:37 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:01.632 10:16:38 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:01.632 10:16:38 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:01.632 10:16:38 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:01.632 10:16:38 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:01.632 10:16:38 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]]
00:09:01.632 10:16:38 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:09:01.632 10:16:38 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:09:01.632
00:09:01.632 real 0m1.530s
00:09:01.632 user 0m0.012s
00:09:01.632 sys 0m0.002s
00:09:01.632 10:16:38 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:01.632 10:16:38 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x
00:09:01.632 ************************************
00:09:01.632 END TEST accel_decomp_full
00:09:01.632 ************************************
00:09:01.632 10:16:38 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:01.632 10:16:38 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:01.632 10:16:38 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:09:01.632 10:16:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:01.632 10:16:38 accel -- common/autotest_common.sh@10 -- # set +x
00:09:01.632
************************************
00:09:01.632 START TEST accel_decomp_mcore
00:09:01.632 ************************************
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=,
00:09:01.632 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r .
00:09:01.632 [2024-07-15 10:16:38.634469] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:09:01.632 [2024-07-15 10:16:38.634528] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450710 ]
00:09:01.632 [2024-07-15 10:16:38.763193] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4
00:09:01.891 [2024-07-15 10:16:38.869018] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:09:01.891 [2024-07-15 10:16:38.869105] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:09:01.891 [2024-07-15 10:16:38.869184] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:09:01.891 [2024-07-15 10:16:38.869189] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes'
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds'
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:01.891 10:16:38 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.302 10:16:40 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:03.302 10:16:40 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.302 10:16:40 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.302 10:16:40 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.303 10:16:40 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]]
00:09:03.303 10:16:40 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:09:03.303 10:16:40 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:09:03.303
00:09:03.303 real 0m1.514s
user 0m4.746s
sys 0m0.208s
10:16:40 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:03.303 10:16:40 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x
00:09:03.303 ************************************
00:09:03.303 END TEST accel_decomp_mcore
00:09:03.303 ************************************
00:09:03.303 10:16:40 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:03.303 10:16:40 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:09:03.303 10:16:40 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:03.303 10:16:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:03.303 10:16:40 accel -- common/autotest_common.sh@10 -- # set +x
00:09:03.303 ************************************
00:09:03.303 START TEST accel_decomp_full_mcore
00:09:03.303 ************************************
10:16:40 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc
10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module
10:16:40
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=,
00:09:03.303 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r .
00:09:03.303 [2024-07-15 10:16:40.217893] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:09:03.303 [2024-07-15 10:16:40.217958] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid450957 ]
00:09:03.303 [2024-07-15 10:16:40.347039] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4
00:09:03.303 [2024-07-15 10:16:40.453786] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
[2024-07-15 10:16:40.453809] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
[2024-07-15 10:16:40.453871] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
[2024-07-15 10:16:40.453875] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes'
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds'
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:03.563 10:16:40 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@19
-- # read -r var val 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:04.942 00:09:04.942 real 0m1.529s 00:09:04.942 user 0m4.774s 00:09:04.942 sys 0m0.227s 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.942 10:16:41 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:04.942 ************************************ 00:09:04.942 END TEST accel_decomp_full_mcore 00:09:04.942 ************************************ 00:09:04.942 10:16:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:04.942 10:16:41 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:04.942 10:16:41 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:04.942 10:16:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.942 10:16:41 accel -- common/autotest_common.sh@10 -- # set +x 00:09:04.942 
************************************ 00:09:04.942 START TEST accel_decomp_mthread 00:09:04.942 ************************************ 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:04.942 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:04.943 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:04.943 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:04.943 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:04.943 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:04.943 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:04.943 10:16:41 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:04.943 [2024-07-15 10:16:41.825484] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:04.943 [2024-07-15 10:16:41.825542] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451164 ] 00:09:04.943 [2024-07-15 10:16:41.951441] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.943 [2024-07-15 10:16:42.047924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 
10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:04.943 10:16:42 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:06.324 00:09:06.324 real 0m1.497s 00:09:06.324 user 0m1.317s 00:09:06.324 sys 0m0.184s 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:06.324 10:16:43 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:09:06.324 ************************************ 00:09:06.324 END TEST accel_decomp_mthread 00:09:06.324 ************************************ 00:09:06.324 10:16:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:06.324 10:16:43 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:06.324 10:16:43 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:06.324 10:16:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:06.324 10:16:43 accel -- common/autotest_common.sh@10 -- # set +x 00:09:06.324 ************************************ 00:09:06.324 START TEST accel_decomp_full_mthread 00:09:06.324 ************************************ 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:06.324 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:06.324 [2024-07-15 10:16:43.402277] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:09:06.324 [2024-07-15 10:16:43.402340] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451357 ] 00:09:06.584 [2024-07-15 10:16:43.530995] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.584 [2024-07-15 10:16:43.631757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.584 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:09:06.585 10:16:43 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:07.961 00:09:07.961 real 0m1.543s 00:09:07.961 user 0m1.357s 00:09:07.961 sys 0m0.191s 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.961 10:16:44 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:07.961 ************************************ 00:09:07.961 END TEST accel_decomp_full_mthread 00:09:07.961 ************************************ 00:09:07.961 10:16:44 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:07.961 10:16:44 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:09:07.961 10:16:44 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:09:07.961 10:16:44 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:09:07.961 10:16:44 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:07.961 10:16:44 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=451556 00:09:07.961 10:16:44 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:07.961 10:16:44 accel -- accel/accel.sh@61 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:07.961 10:16:44 accel -- accel/accel.sh@63 -- # waitforlisten 451556 00:09:07.961 10:16:44 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:07.961 10:16:44 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:07.961 10:16:44 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.961 10:16:44 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.961 10:16:44 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:07.961 10:16:44 accel -- common/autotest_common.sh@829 -- # '[' -z 451556 ']' 00:09:07.961 10:16:44 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:07.961 10:16:44 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:07.961 10:16:44 accel -- accel/accel.sh@41 -- # jq -r . 00:09:07.961 10:16:44 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:07.961 10:16:44 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:07.962 10:16:44 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:07.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:07.962 10:16:44 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:07.962 10:16:44 accel -- common/autotest_common.sh@10 -- # set +x 00:09:07.962 [2024-07-15 10:16:45.013026] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:07.962 [2024-07-15 10:16:45.013097] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451556 ] 00:09:07.962 [2024-07-15 10:16:45.139507] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.221 [2024-07-15 10:16:45.247456] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.158 [2024-07-15 10:16:46.019984] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:09.158 10:16:46 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:09.158 10:16:46 accel -- common/autotest_common.sh@862 -- # return 0 00:09:09.158 10:16:46 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:09.158 10:16:46 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:09.158 10:16:46 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:09.159 10:16:46 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:09:09.159 10:16:46 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:09:09.159 10:16:46 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:09:09.159 10:16:46 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.159 10:16:46 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:09:09.159 10:16:46 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.159 10:16:46 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.418 "method": "compressdev_scan_accel_module", 00:09:09.418 10:16:46 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:09.418 10:16:46 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.418 10:16:46 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- 
accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # IFS== 00:09:09.418 10:16:46 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:09.418 10:16:46 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:09.418 10:16:46 accel -- accel/accel.sh@75 -- # killprocess 451556 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@948 -- # '[' -z 451556 ']' 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@952 -- # kill -0 451556 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@953 -- # uname 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 451556 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:09.418 10:16:46 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 451556' 00:09:09.418 killing process with pid 451556 00:09:09.419 10:16:46 accel -- common/autotest_common.sh@967 -- # kill 
451556 00:09:09.419 10:16:46 accel -- common/autotest_common.sh@972 -- # wait 451556 00:09:09.987 10:16:46 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:09.987 10:16:46 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:09.987 10:16:46 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:09.987 10:16:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.987 10:16:46 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.987 ************************************ 00:09:09.987 START TEST accel_cdev_comp 00:09:09.987 ************************************ 00:09:09.987 10:16:46 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:09.987 10:16:46 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:09.987 10:16:46 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:09:09.987 [2024-07-15 10:16:46.960635] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:09:09.987 [2024-07-15 10:16:46.960702] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid451914 ] 00:09:09.987 [2024-07-15 10:16:47.092430] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.246 [2024-07-15 10:16:47.196912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.815 [2024-07-15 10:16:47.957526] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:10.815 [2024-07-15 10:16:47.960169] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1754080 PMD being used: compress_qat 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var 
val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 [2024-07-15 10:16:47.964298] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1758e60 PMD being used: compress_qat 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:10.815 10:16:47 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:10.815 10:16:47 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:12.193 10:16:49 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:12.193 00:09:12.193 real 0m2.219s 00:09:12.193 user 0m0.013s 00:09:12.193 sys 0m0.001s 00:09:12.193 10:16:49 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:12.193 10:16:49 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:09:12.193 ************************************ 00:09:12.193 END TEST accel_cdev_comp 00:09:12.193 ************************************ 00:09:12.193 10:16:49 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:12.193 10:16:49 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.193 10:16:49 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:12.193 10:16:49 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.193 10:16:49 accel -- common/autotest_common.sh@10 -- # set +x 00:09:12.193 ************************************ 00:09:12.193 START TEST accel_cdev_decomp 00:09:12.193 ************************************ 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:12.193 10:16:49 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:12.193 10:16:49 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:12.193 [2024-07-15 10:16:49.224067] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:09:12.193 [2024-07-15 10:16:49.224109] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid452121 ] 00:09:12.193 [2024-07-15 10:16:49.335675] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.452 [2024-07-15 10:16:49.437252] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.020 [2024-07-15 10:16:50.201718] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:13.020 [2024-07-15 10:16:50.204339] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20e7080 PMD being used: compress_qat 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 [2024-07-15 10:16:50.208511] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20ebe60 PMD being used: compress_qat 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:13.020 10:16:50 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:13.020 10:16:50 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:14.398 10:16:51 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:14.398 00:09:14.398 real 0m2.163s 00:09:14.398 user 0m0.013s 00:09:14.398 sys 0m0.000s 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:14.398 10:16:51 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:14.398 ************************************ 00:09:14.398 END TEST accel_cdev_decomp 00:09:14.398 ************************************ 00:09:14.398 10:16:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:14.398 10:16:51 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:14.398 10:16:51 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:14.398 10:16:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:14.398 10:16:51 accel -- common/autotest_common.sh@10 -- # set +x 00:09:14.398 ************************************ 00:09:14.398 START TEST accel_cdev_decomp_full 00:09:14.398 ************************************ 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:14.398 10:16:51 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:14.398 [2024-07-15 10:16:51.460409] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
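The xtrace above shows `build_accel_config` assembling the JSON entry that `accel_perf` later reads over `/dev/fd/62`. As a hedged reconstruction of just those traced steps (accel/accel.sh@31, @37, @40), assuming nothing beyond what the trace itself shows, the array build looks like:

```shell
#!/usr/bin/env bash
# Hedged reconstruction of the build_accel_config steps visible in the
# xtrace above; this is an illustration, not the authoritative accel.sh.
accel_json_cfg=()
# accel.sh@37: this run enables the compressdev accel module entry.
accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
# accel.sh@40: entries are joined with commas when expanded...
IFS=,
echo "${accel_json_cfg[*]}"
# ...and accel.sh@41 then normalizes the result through `jq -r .`.
```

In the harness this joined config is what `accel_perf -c /dev/fd/62 ...` consumes; the file-descriptor plumbing itself is outside what the trace shows, so it is not sketched here.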
00:09:14.398 [2024-07-15 10:16:51.460466] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid452485 ] 00:09:14.398 [2024-07-15 10:16:51.586472] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.656 [2024-07-15 10:16:51.687843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.590 [2024-07-15 10:16:52.454824] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:15.590 [2024-07-15 10:16:52.457380] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dd4080 PMD being used: compress_qat 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case 
"$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.590 [2024-07-15 10:16:52.460689] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1dd3ce0 PMD being used: compress_qat 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.590 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:15.591 10:16:52 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:16.525 00:09:16.525 real 0m2.205s 00:09:16.525 user 0m0.011s 00:09:16.525 sys 0m0.002s 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:16.525 10:16:53 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:16.525 ************************************ 00:09:16.525 END TEST accel_cdev_decomp_full 00:09:16.525 ************************************ 00:09:16.525 10:16:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:16.525 10:16:53 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:16.525 10:16:53 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:16.525 10:16:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.525 10:16:53 accel -- common/autotest_common.sh@10 -- # set +x 00:09:16.525 ************************************ 00:09:16.525 START TEST accel_cdev_decomp_mcore 00:09:16.525 ************************************ 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:16.525 10:16:53 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:16.784 [2024-07-15 10:16:53.746387] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
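The mcore variant above passes `-m 0xf`, and the startup notices that follow report four available cores with reactors on cores 0-3. As an illustration only (not part of the test harness), a hex core mask expands to its core list like this:

```shell
#!/usr/bin/env bash
# Illustration: expand an SPDK-style -m core mask into the cores it selects.
# 0xf sets bits 0-3, which matches "Total cores available: 4" and the four
# "Reactor started on core N" notices logged for this run.
mask=$((0xf))
cores=()
for bit in {0..31}; do
  if (( (mask >> bit) & 1 )); then
    cores+=("$bit")
  fi
done
echo "${cores[@]}"  # 0 1 2 3
```

A mask of `0x1`, as used by the single-core runs earlier in this log, would select only core 0 by the same rule.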
00:09:16.784 [2024-07-15 10:16:53.746446] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid452848 ] 00:09:16.784 [2024-07-15 10:16:53.874812] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:16.784 [2024-07-15 10:16:53.975304] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:16.784 [2024-07-15 10:16:53.975389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:16.784 [2024-07-15 10:16:53.975463] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:16.784 [2024-07-15 10:16:53.975468] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:17.736 [2024-07-15 10:16:54.728526] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:17.736 [2024-07-15 10:16:54.731139] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16f3720 PMD being used: compress_qat 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 
10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:17.736 [2024-07-15 10:16:54.736787] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8acc19b8b0 PMD being used: compress_qat 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 [2024-07-15 10:16:54.737541] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8ac419b8b0 PMD being used: compress_qat 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.736 [2024-07-15 10:16:54.738627] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16f89f0 PMD being used: compress_qat 00:09:17.736 [2024-07-15 10:16:54.738819] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8abc19b8b0 PMD being used: compress_qat 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 
10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:17.736 10:16:54 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.736 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:17.737 10:16:54 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 
10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:19.165 00:09:19.165 real 0m2.222s 00:09:19.165 user 0m7.195s 00:09:19.165 
sys 0m0.592s 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:19.165 10:16:55 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:19.165 ************************************ 00:09:19.165 END TEST accel_cdev_decomp_mcore 00:09:19.165 ************************************ 00:09:19.165 10:16:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:19.165 10:16:55 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:19.165 10:16:55 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:19.165 10:16:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.165 10:16:55 accel -- common/autotest_common.sh@10 -- # set +x 00:09:19.165 ************************************ 00:09:19.165 START TEST accel_cdev_decomp_full_mcore 00:09:19.165 ************************************ 00:09:19.165 10:16:55 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:19.165 10:16:55 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 
00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:19.165 10:16:56 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:19.165 [2024-07-15 10:16:56.033818] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:19.165 [2024-07-15 10:16:56.033879] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid453055 ] 00:09:19.165 [2024-07-15 10:16:56.163509] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:19.165 [2024-07-15 10:16:56.269618] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:19.165 [2024-07-15 10:16:56.269704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:19.165 [2024-07-15 10:16:56.269781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:19.165 [2024-07-15 10:16:56.269785] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.104 [2024-07-15 10:16:57.035086] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:20.104 [2024-07-15 10:16:57.037679] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8b5720 PMD being used: compress_qat 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 [2024-07-15 10:16:57.042427] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7efd3819b8b0 PMD being used: compress_qat 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 [2024-07-15 10:16:57.043207] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7efd3019b8b0 PMD being used: compress_qat 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 [2024-07-15 10:16:57.044336] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8b8a30 PMD being used: compress_qat 00:09:20.104 [2024-07-15 10:16:57.044542] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7efd2819b8b0 PMD being used: compress_qat 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 
-- # val=decompress 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:20.104 10:16:57 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.042 10:16:58 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.042 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.302 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:21.303 10:16:58 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:21.303 00:09:21.303 real 0m2.243s 00:09:21.303 user 0m7.244s 00:09:21.303 sys 0m0.592s 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:21.303 10:16:58 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:21.303 ************************************ 00:09:21.303 END TEST accel_cdev_decomp_full_mcore 00:09:21.303 ************************************ 00:09:21.303 10:16:58 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:21.303 10:16:58 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:21.303 10:16:58 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:21.303 10:16:58 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:21.303 10:16:58 accel -- common/autotest_common.sh@10 -- # set +x 00:09:21.303 ************************************ 00:09:21.303 START TEST accel_cdev_decomp_mthread 00:09:21.303 ************************************ 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- 
accel/accel.sh@17 -- # local accel_module 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:21.303 10:16:58 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:21.303 [2024-07-15 10:16:58.353667] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:21.303 [2024-07-15 10:16:58.353725] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid453428 ] 00:09:21.303 [2024-07-15 10:16:58.481714] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:21.562 [2024-07-15 10:16:58.583301] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.184 [2024-07-15 10:16:59.341218] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:22.184 [2024-07-15 10:16:59.343795] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7c6080 PMD being used: compress_qat 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 [2024-07-15 10:16:59.348678] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7cb2a0 PMD being used: compress_qat 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 [2024-07-15 10:16:59.351206] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x8ee0f0 PMD being used: compress_qat 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 
10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:22.184 10:16:59 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:23.560 00:09:23.560 real 0m2.205s 00:09:23.560 user 0m1.636s 00:09:23.560 sys 0m0.573s 00:09:23.560 10:17:00 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.561 10:17:00 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:23.561 ************************************ 00:09:23.561 END TEST accel_cdev_decomp_mthread 00:09:23.561 ************************************ 00:09:23.561 10:17:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:23.561 10:17:00 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:23.561 10:17:00 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:23.561 10:17:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.561 10:17:00 accel -- common/autotest_common.sh@10 -- # set +x 00:09:23.561 ************************************ 00:09:23.561 START TEST accel_cdev_decomp_full_mthread 00:09:23.561 ************************************ 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:23.561 10:17:00 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:09:23.561 [2024-07-15 10:17:00.633101] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:09:23.561 [2024-07-15 10:17:00.633161] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid453792 ] 00:09:23.819 [2024-07-15 10:17:00.760476] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.819 [2024-07-15 10:17:00.860763] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.756 [2024-07-15 10:17:01.624138] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:24.756 [2024-07-15 10:17:01.626801] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf4b080 PMD being used: compress_qat 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 [2024-07-15 10:17:01.631044] 
accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf4e3b0 PMD being used: compress_qat 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 [2024-07-15 10:17:01.634000] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1072cc0 PMD being used: compress_qat 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 
10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.756 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:24.757 10:17:01 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:25.693 00:09:25.693 real 0m2.222s 00:09:25.693 user 0m1.624s 00:09:25.693 sys 0m0.602s 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:25.693 10:17:02 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:25.693 ************************************ 00:09:25.693 END TEST accel_cdev_decomp_full_mthread 00:09:25.693 ************************************ 00:09:25.693 10:17:02 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:09:25.693 10:17:02 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:09:25.693 10:17:02 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:25.693 10:17:02 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:25.693 10:17:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:25.693 10:17:02 accel -- common/autotest_common.sh@10 -- # set +x 00:09:25.693 10:17:02 accel -- accel/accel.sh@137 -- # build_accel_config 00:09:25.693 10:17:02 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:25.693 10:17:02 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:25.693 10:17:02 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:25.693 10:17:02 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:25.693 10:17:02 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:25.693 10:17:02 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:25.693 10:17:02 accel -- accel/accel.sh@41 -- # jq -r . 00:09:25.952 ************************************ 00:09:25.952 START TEST accel_dif_functional_tests 00:09:25.952 ************************************ 00:09:25.952 10:17:02 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:25.952 [2024-07-15 10:17:02.963375] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:25.952 [2024-07-15 10:17:02.963446] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid454004 ] 00:09:25.952 [2024-07-15 10:17:03.092637] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:26.211 [2024-07-15 10:17:03.201128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:26.211 [2024-07-15 10:17:03.201215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:26.211 [2024-07-15 10:17:03.201219] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.211 00:09:26.211 00:09:26.211 CUnit - A unit testing framework for C - Version 2.1-3 00:09:26.211 http://cunit.sourceforge.net/ 00:09:26.211 00:09:26.211 00:09:26.211 Suite: accel_dif 00:09:26.211 Test: verify: DIF generated, GUARD check ...passed 00:09:26.211 Test: verify: DIF generated, APPTAG check ...passed 00:09:26.211 Test: verify: DIF generated, REFTAG check ...passed 00:09:26.211 Test: verify: DIF not generated, GUARD check ...[2024-07-15 10:17:03.309984] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:26.211 passed 00:09:26.211 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 10:17:03.310054] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:26.211 passed 00:09:26.211 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 10:17:03.310086] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:26.211 passed 00:09:26.211 Test: verify: APPTAG correct, APPTAG check ...passed 00:09:26.211 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 10:17:03.310155] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:09:26.211 passed 
00:09:26.211 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:09:26.211 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:09:26.211 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:09:26.211 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 10:17:03.310301] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:09:26.211 passed 00:09:26.211 Test: verify copy: DIF generated, GUARD check ...passed 00:09:26.211 Test: verify copy: DIF generated, APPTAG check ...passed 00:09:26.211 Test: verify copy: DIF generated, REFTAG check ...passed 00:09:26.211 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 10:17:03.310457] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:26.211 passed 00:09:26.211 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 10:17:03.310492] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:26.211 passed 00:09:26.211 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 10:17:03.310527] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:26.211 passed 00:09:26.211 Test: generate copy: DIF generated, GUARD check ...passed 00:09:26.211 Test: generate copy: DIF generated, APTTAG check ...passed 00:09:26.211 Test: generate copy: DIF generated, REFTAG check ...passed 00:09:26.211 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:09:26.211 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:09:26.211 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:09:26.211 Test: generate copy: iovecs-len validate ...[2024-07-15 10:17:03.310767] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:09:26.211 passed 00:09:26.211 Test: generate copy: buffer alignment validate ...passed 00:09:26.211 00:09:26.211 Run Summary: Type Total Ran Passed Failed Inactive 00:09:26.211 suites 1 1 n/a 0 0 00:09:26.211 tests 26 26 26 0 0 00:09:26.211 asserts 115 115 115 0 n/a 00:09:26.211 00:09:26.211 Elapsed time = 0.003 seconds 00:09:26.469 00:09:26.469 real 0m0.642s 00:09:26.469 user 0m0.854s 00:09:26.469 sys 0m0.243s 00:09:26.469 10:17:03 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:26.469 10:17:03 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:09:26.469 ************************************ 00:09:26.469 END TEST accel_dif_functional_tests 00:09:26.469 ************************************ 00:09:26.469 10:17:03 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:26.469 00:09:26.469 real 0m53.512s 00:09:26.469 user 1m1.628s 00:09:26.469 sys 0m11.867s 00:09:26.469 10:17:03 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:26.469 10:17:03 accel -- common/autotest_common.sh@10 -- # set +x 00:09:26.469 ************************************ 00:09:26.469 END TEST accel 00:09:26.469 ************************************ 00:09:26.469 10:17:03 -- common/autotest_common.sh@1142 -- # return 0 00:09:26.469 10:17:03 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:26.469 10:17:03 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:26.469 10:17:03 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:26.469 10:17:03 -- common/autotest_common.sh@10 -- # set +x 00:09:26.469 ************************************ 00:09:26.469 START TEST accel_rpc 00:09:26.469 ************************************ 00:09:26.469 10:17:03 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:26.728 * Looking for test storage... 
00:09:26.728 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:26.728 10:17:03 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:26.728 10:17:03 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=454229 00:09:26.728 10:17:03 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 454229 00:09:26.728 10:17:03 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 454229 ']' 00:09:26.728 10:17:03 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:26.728 10:17:03 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:26.728 10:17:03 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:26.728 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:26.728 10:17:03 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:26.728 10:17:03 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:09:26.728 10:17:03 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:26.728 [2024-07-15 10:17:03.833059] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:26.728 [2024-07-15 10:17:03.833133] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid454229 ] 00:09:26.985 [2024-07-15 10:17:03.963015] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:26.985 [2024-07-15 10:17:04.066550] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.550 10:17:04 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:27.550 10:17:04 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:09:27.550 10:17:04 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:09:27.550 10:17:04 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:09:27.550 10:17:04 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:09:27.550 10:17:04 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:09:27.550 10:17:04 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:09:27.550 10:17:04 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:27.550 10:17:04 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:27.550 10:17:04 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:27.809 ************************************ 00:09:27.809 START TEST accel_assign_opcode 00:09:27.809 ************************************ 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:27.809 [2024-07-15 10:17:04.780912] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:27.809 [2024-07-15 10:17:04.788929] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:27.809 10:17:04 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:28.068 software 00:09:28.068 00:09:28.068 real 0m0.334s 00:09:28.068 user 0m0.073s 00:09:28.068 sys 0m0.010s 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:09:28.068 10:17:05 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:28.068 ************************************ 00:09:28.068 END TEST accel_assign_opcode 00:09:28.068 ************************************ 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:09:28.068 10:17:05 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 454229 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 454229 ']' 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 454229 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 454229 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 454229' 00:09:28.068 killing process with pid 454229 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@967 -- # kill 454229 00:09:28.068 10:17:05 accel_rpc -- common/autotest_common.sh@972 -- # wait 454229 00:09:28.634 00:09:28.634 real 0m1.936s 00:09:28.634 user 0m2.001s 00:09:28.634 sys 0m0.589s 00:09:28.634 10:17:05 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:28.634 10:17:05 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:28.634 ************************************ 00:09:28.634 END TEST accel_rpc 00:09:28.634 ************************************ 00:09:28.634 10:17:05 -- common/autotest_common.sh@1142 -- # return 0 00:09:28.634 10:17:05 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:09:28.634 10:17:05 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:28.634 10:17:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:28.634 10:17:05 -- common/autotest_common.sh@10 -- # set +x 00:09:28.634 ************************************ 00:09:28.634 START TEST app_cmdline 00:09:28.634 ************************************ 00:09:28.634 10:17:05 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:09:28.634 * Looking for test storage... 00:09:28.634 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:28.634 10:17:05 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:09:28.634 10:17:05 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=454558 00:09:28.634 10:17:05 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 454558 00:09:28.634 10:17:05 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:09:28.634 10:17:05 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 454558 ']' 00:09:28.634 10:17:05 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:28.634 10:17:05 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:28.634 10:17:05 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:28.634 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:28.634 10:17:05 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:28.634 10:17:05 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:28.892 [2024-07-15 10:17:05.839281] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:28.892 [2024-07-15 10:17:05.839358] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid454558 ] 00:09:28.892 [2024-07-15 10:17:05.970666] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.892 [2024-07-15 10:17:06.070434] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.830 10:17:06 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:29.830 10:17:06 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:09:29.830 10:17:06 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:09:29.830 { 00:09:29.830 "version": "SPDK v24.09-pre git sha1 719d03c6a", 00:09:29.830 "fields": { 00:09:29.830 "major": 24, 00:09:29.830 "minor": 9, 00:09:29.830 "patch": 0, 00:09:29.830 "suffix": "-pre", 00:09:29.830 "commit": "719d03c6a" 00:09:29.830 } 00:09:29.830 } 00:09:29.830 10:17:07 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:09:29.830 10:17:07 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:09:29.830 10:17:07 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:09:29.830 10:17:07 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:09:29.830 10:17:07 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:09:29.830 10:17:07 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:29.830 10:17:07 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:29.830 10:17:07 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:09:29.830 10:17:07 app_cmdline -- app/cmdline.sh@26 -- # sort 00:09:29.830 10:17:07 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:30.087 10:17:07 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:09:30.087 10:17:07 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:09:30.087 10:17:07 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:30.087 10:17:07 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:09:30.087 10:17:07 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:30.087 10:17:07 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:30.087 10:17:07 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:30.087 10:17:07 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:30.088 request: 00:09:30.088 { 00:09:30.088 "method": "env_dpdk_get_mem_stats", 00:09:30.088 "req_id": 1 00:09:30.088 } 00:09:30.088 Got JSON-RPC error response 00:09:30.088 response: 00:09:30.088 { 00:09:30.088 
"code": -32601, 00:09:30.088 "message": "Method not found" 00:09:30.088 } 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:30.088 10:17:07 app_cmdline -- app/cmdline.sh@1 -- # killprocess 454558 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 454558 ']' 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 454558 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 454558 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 454558' 00:09:30.088 killing process with pid 454558 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@967 -- # kill 454558 00:09:30.088 10:17:07 app_cmdline -- common/autotest_common.sh@972 -- # wait 454558 00:09:30.654 00:09:30.654 real 0m1.976s 00:09:30.654 user 0m2.332s 00:09:30.654 sys 0m0.592s 00:09:30.654 10:17:07 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.655 10:17:07 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:30.655 ************************************ 00:09:30.655 END TEST app_cmdline 00:09:30.655 ************************************ 00:09:30.655 10:17:07 -- common/autotest_common.sh@1142 -- # return 0 00:09:30.655 10:17:07 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:30.655 10:17:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:30.655 10:17:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:30.655 10:17:07 -- common/autotest_common.sh@10 -- # set +x 00:09:30.655 ************************************ 00:09:30.655 START TEST version 00:09:30.655 ************************************ 00:09:30.655 10:17:07 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:30.655 * Looking for test storage... 00:09:30.655 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:30.655 10:17:07 version -- app/version.sh@17 -- # get_header_version major 00:09:30.655 10:17:07 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:30.655 10:17:07 version -- app/version.sh@14 -- # cut -f2 00:09:30.655 10:17:07 version -- app/version.sh@14 -- # tr -d '"' 00:09:30.655 10:17:07 version -- app/version.sh@17 -- # major=24 00:09:30.655 10:17:07 version -- app/version.sh@18 -- # get_header_version minor 00:09:30.655 10:17:07 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:30.655 10:17:07 version -- app/version.sh@14 -- # cut -f2 00:09:30.655 10:17:07 version -- app/version.sh@14 -- # tr -d '"' 00:09:30.655 10:17:07 version -- app/version.sh@18 -- # minor=9 00:09:30.914 10:17:07 version -- app/version.sh@19 -- # get_header_version patch 00:09:30.914 10:17:07 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:30.914 10:17:07 version -- app/version.sh@14 -- # cut -f2 00:09:30.914 10:17:07 version -- app/version.sh@14 -- # tr -d '"' 00:09:30.914 
10:17:07 version -- app/version.sh@19 -- # patch=0 00:09:30.914 10:17:07 version -- app/version.sh@20 -- # get_header_version suffix 00:09:30.914 10:17:07 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:30.914 10:17:07 version -- app/version.sh@14 -- # cut -f2 00:09:30.914 10:17:07 version -- app/version.sh@14 -- # tr -d '"' 00:09:30.914 10:17:07 version -- app/version.sh@20 -- # suffix=-pre 00:09:30.914 10:17:07 version -- app/version.sh@22 -- # version=24.9 00:09:30.914 10:17:07 version -- app/version.sh@25 -- # (( patch != 0 )) 00:09:30.914 10:17:07 version -- app/version.sh@28 -- # version=24.9rc0 00:09:30.914 10:17:07 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:09:30.914 10:17:07 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:30.914 10:17:07 version -- app/version.sh@30 -- # py_version=24.9rc0 00:09:30.914 10:17:07 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:09:30.914 00:09:30.914 real 0m0.188s 00:09:30.914 user 0m0.099s 00:09:30.914 sys 0m0.133s 00:09:30.914 10:17:07 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.914 10:17:07 version -- common/autotest_common.sh@10 -- # set +x 00:09:30.914 ************************************ 00:09:30.914 END TEST version 00:09:30.914 ************************************ 00:09:30.914 10:17:07 -- common/autotest_common.sh@1142 -- # return 0 00:09:30.914 10:17:07 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:09:30.914 10:17:07 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:30.914 10:17:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:30.914 10:17:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:30.914 10:17:07 -- common/autotest_common.sh@10 -- # set +x 00:09:30.914 ************************************ 00:09:30.914 START TEST blockdev_general 00:09:30.914 ************************************ 00:09:30.914 10:17:07 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:30.914 * Looking for test storage... 00:09:30.914 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:30.914 10:17:08 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=454952 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:09:30.914 10:17:08 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 454952 00:09:30.914 10:17:08 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 454952 ']' 00:09:30.914 10:17:08 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:30.914 10:17:08 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:30.914 10:17:08 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:09:30.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:30.914 10:17:08 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:30.914 10:17:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:31.173 [2024-07-15 10:17:08.185335] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:09:31.173 [2024-07-15 10:17:08.185430] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid454952 ] 00:09:31.173 [2024-07-15 10:17:08.332732] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:31.431 [2024-07-15 10:17:08.431780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:31.999 10:17:09 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:31.999 10:17:09 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:09:31.999 10:17:09 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:09:31.999 10:17:09 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:09:31.999 10:17:09 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:31.999 10:17:09 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:31.999 10:17:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.258 [2024-07-15 10:17:09.349800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:32.258 [2024-07-15 10:17:09.349852] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:32.258 00:09:32.258 [2024-07-15 10:17:09.357787] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:32.258 [2024-07-15 10:17:09.357814] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:09:32.258 00:09:32.258 Malloc0 00:09:32.258 Malloc1 00:09:32.258 Malloc2 00:09:32.258 Malloc3 00:09:32.258 Malloc4 00:09:32.258 Malloc5 00:09:32.517 Malloc6 00:09:32.517 Malloc7 00:09:32.517 Malloc8 00:09:32.517 Malloc9 00:09:32.517 [2024-07-15 10:17:09.506769] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:32.517 [2024-07-15 10:17:09.506821] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:32.517 [2024-07-15 10:17:09.506840] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfda350 00:09:32.517 [2024-07-15 10:17:09.506853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:32.517 [2024-07-15 10:17:09.508207] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:32.517 [2024-07-15 10:17:09.508234] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:32.517 TestPT 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:32.517 5000+0 records in 00:09:32.517 5000+0 records out 00:09:32.517 10240000 bytes (10 MB, 9.8 MiB) copied, 0.017329 s, 591 MB/s 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.517 AIO0 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:09:32.517 10:17:09 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:09:32.517 10:17:09 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:32.517 10:17:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:33.084 10:17:10 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:33.084 10:17:10 blockdev_general 
-- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:09:33.084 10:17:10 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:09:33.086 10:17:10 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "3461f003-5fff-4e36-9600-aba399031aad"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3461f003-5fff-4e36-9600-aba399031aad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "71fc0429-c921-5890-9d66-6252e8056378"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "71fc0429-c921-5890-9d66-6252e8056378",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "6d7ec95c-d02a-5881-88ce-91440244eeda"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6d7ec95c-d02a-5881-88ce-91440244eeda",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f938e1c6-b542-5322-b506-0a07e7343464"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f938e1c6-b542-5322-b506-0a07e7343464",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "6f5a0ec8-a7c7-577a-a7d4-46e42ef959b8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6f5a0ec8-a7c7-577a-a7d4-46e42ef959b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "3ed60577-73ce-5c69-8f6e-d03f6cc53c54"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3ed60577-73ce-5c69-8f6e-d03f6cc53c54",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "ee3af2cd-94b0-5b3d-9c6b-d1733413033f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ee3af2cd-94b0-5b3d-9c6b-d1733413033f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "d76924a9-eedb-5d9a-807a-93ba44232012"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d76924a9-eedb-5d9a-807a-93ba44232012",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "28eb2431-937c-58f8-bd08-ea036f537dc1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "28eb2431-937c-58f8-bd08-ea036f537dc1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c7de46ca-b411-55ea-a124-a96fb26daf28"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c7de46ca-b411-55ea-a124-a96fb26daf28",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "ac841b17-f124-56a9-999a-eb69560d5bba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ac841b17-f124-56a9-999a-eb69560d5bba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "846f4182-c8e5-5d43-bcf1-f9bce6ad40a9"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "846f4182-c8e5-5d43-bcf1-f9bce6ad40a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "332bc8fe-5417-4df4-8758-a7250d8f94d5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "332bc8fe-5417-4df4-8758-a7250d8f94d5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "332bc8fe-5417-4df4-8758-a7250d8f94d5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "b97519f1-2e0e-45f3-bf16-2cbfd97cbf70",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "77286cc1-4075-4b8d-80d9-fc65a0d5f0cf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "9436d2b5-d694-433a-b884-d6d2fbaa74a2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "468838cc-ead3-4a3b-8e36-f3764b7af508",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "5bdcd4fc-a5d6-493f-a238-84c6d8921856"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "5bdcd4fc-a5d6-493f-a238-84c6d8921856",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "5bdcd4fc-a5d6-493f-a238-84c6d8921856",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a92edafb-ca1b-449a-8bdf-754d7bbbac49",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "177908ca-10ed-4e60-b3fb-83e77ff90146",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5424c682-6f99-4005-b842-a43f323ef3e0"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5424c682-6f99-4005-b842-a43f323ef3e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:33.086 10:17:10 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:09:33.086 10:17:10 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:09:33.086 10:17:10 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:09:33.086 10:17:10 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 454952 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 454952 ']' 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 454952 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 454952 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 454952' 00:09:33.086 killing process with pid 454952 00:09:33.086 10:17:10 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 454952 00:09:33.086 10:17:10 blockdev_general -- common/autotest_common.sh@972 -- # wait 454952 00:09:33.727 10:17:10 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:33.727 10:17:10 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:33.727 10:17:10 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:33.727 10:17:10 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:33.727 10:17:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:33.727 ************************************ 00:09:33.727 START TEST bdev_hello_world 00:09:33.727 ************************************ 00:09:33.727 10:17:10 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:33.727 [2024-07-15 10:17:10.780547] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:09:33.727 [2024-07-15 10:17:10.780615] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid455332 ]
00:09:33.727 [2024-07-15 10:17:10.915615] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:33.985 [2024-07-15 10:17:11.018205] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:33.985 [2024-07-15 10:17:11.171055] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:33.985 [2024-07-15 10:17:11.171122] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:33.985 [2024-07-15 10:17:11.171137] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:33.985 [2024-07-15 10:17:11.179058] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:33.985 [2024-07-15 10:17:11.179085] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:34.242 [2024-07-15 10:17:11.187070] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:34.242 [2024-07-15 10:17:11.187096] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:34.242 [2024-07-15 10:17:11.260971] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:34.242 [2024-07-15 10:17:11.261028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:34.242 [2024-07-15 10:17:11.261046] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28f13c0
00:09:34.242 [2024-07-15 10:17:11.261059] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:34.242 [2024-07-15 10:17:11.262502] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:34.242 [2024-07-15 10:17:11.262532] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:34.242 [2024-07-15 10:17:11.408378] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application
00:09:34.242 [2024-07-15 10:17:11.408450] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0
00:09:34.242 [2024-07-15 10:17:11.408505] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel
00:09:34.242 [2024-07-15 10:17:11.408584] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev
00:09:34.242 [2024-07-15 10:17:11.408661] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully
00:09:34.242 [2024-07-15 10:17:11.408692] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io
00:09:34.242 [2024-07-15 10:17:11.408757] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World!
00:09:34.242
00:09:34.242 [2024-07-15 10:17:11.408797] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app
00:09:34.807
00:09:34.807 real	0m1.035s
00:09:34.807 user	0m0.683s
00:09:34.807 sys	0m0.313s
00:09:34.807 10:17:11 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:34.807 10:17:11 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x
00:09:34.807 ************************************
00:09:34.807 END TEST bdev_hello_world
00:09:34.807 ************************************
00:09:34.807 10:17:11 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:34.807 10:17:11 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds ''
00:09:34.807 10:17:11 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:09:34.807 10:17:11 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:34.807 10:17:11 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:34.807 ************************************
00:09:34.807 START TEST bdev_bounds
00:09:34.807 ************************************
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds ''
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=455527
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 455527'
00:09:34.807 Process bdevio pid: 455527
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 455527
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 455527 ']'
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:09:34.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable
00:09:34.807 10:17:11 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:09:34.807 [2024-07-15 10:17:11.908033] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:09:34.807 [2024-07-15 10:17:11.908108] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid455527 ]
00:09:35.065 [2024-07-15 10:17:12.038872] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3
00:09:35.065 [2024-07-15 10:17:12.141757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:09:35.065 [2024-07-15 10:17:12.141841] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:09:35.065 [2024-07-15 10:17:12.141846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:35.323 [2024-07-15 10:17:12.302325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:35.323 [2024-07-15 10:17:12.302381] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:35.323 [2024-07-15 10:17:12.302397] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:35.323 [2024-07-15 10:17:12.310333] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:35.323 [2024-07-15 10:17:12.310360] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:35.323 [2024-07-15 10:17:12.318347] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:35.323 [2024-07-15 10:17:12.318373] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:09:35.323 [2024-07-15 10:17:12.395743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:35.323 [2024-07-15 10:17:12.395793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:09:35.323 [2024-07-15 10:17:12.395811] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x184d0c0
00:09:35.323 [2024-07-15 10:17:12.395824] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:09:35.323 [2024-07-15 10:17:12.397319] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:09:35.323 [2024-07-15 10:17:12.397350] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:09:35.890 10:17:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:09:35.890 10:17:12 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0
00:09:35.890 10:17:12 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests
00:09:35.890 I/O targets:
00:09:35.890 Malloc0: 65536 blocks of 512 bytes (32 MiB)
00:09:35.890 Malloc1p0: 32768 blocks of 512 bytes (16 MiB)
00:09:35.890 Malloc1p1: 32768 blocks of 512 bytes (16 MiB)
00:09:35.890 Malloc2p0: 8192 blocks of 512 bytes (4 MiB)
00:09:35.890 Malloc2p1: 8192 blocks of 512 bytes (4 MiB)
00:09:35.890 Malloc2p2: 8192 blocks of 512 bytes (4 MiB)
00:09:35.891 Malloc2p3: 8192 blocks of 512 bytes (4 MiB)
00:09:35.891 Malloc2p4: 8192 blocks of 512 bytes (4 MiB)
00:09:35.891 Malloc2p5: 8192 blocks of 512 bytes (4 MiB)
00:09:35.891 Malloc2p6: 8192 blocks of 512 bytes (4 MiB)
00:09:35.891 Malloc2p7: 8192 blocks of 512 bytes (4 MiB)
00:09:35.891 TestPT: 65536 blocks of 512 bytes (32 MiB)
00:09:35.891 raid0: 131072 blocks of 512 bytes (64 MiB)
00:09:35.891 concat0: 131072 blocks of 512 bytes (64 MiB)
00:09:35.891 raid1: 65536 blocks of 512 bytes (32 MiB)
00:09:35.891 AIO0: 5000 blocks of 2048 bytes (10 MiB)
00:09:35.891
00:09:35.891
00:09:35.891 CUnit - A unit testing framework for C - Version 2.1-3
00:09:35.891 http://cunit.sourceforge.net/
00:09:35.891
00:09:35.891
00:09:35.891 Suite: bdevio tests on: AIO0
00:09:35.891 Test: blockdev write read block ...passed
00:09:35.891 Test: blockdev write zeroes read block ...passed
00:09:35.891 Test: blockdev write zeroes read no split ...passed
00:09:35.891 Test: blockdev write zeroes read split ...passed
00:09:35.891 Test: blockdev write zeroes read split partial ...passed
00:09:35.891 Test: blockdev reset ...passed
00:09:35.891 Test: blockdev write read 8 blocks ...passed
00:09:35.891 Test: blockdev write read size > 128k ...passed
00:09:35.891 Test: blockdev write read invalid size ...passed
00:09:35.891 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:35.891 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:35.891 Test: blockdev write read max offset ...passed
00:09:35.891 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:35.891 Test: blockdev writev readv 8 blocks ...passed
00:09:35.891 Test: blockdev writev readv 30 x 1block ...passed
00:09:35.891 Test: blockdev writev readv block ...passed
00:09:35.891 Test: blockdev writev readv size > 128k ...passed
00:09:35.891 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:35.891 Test: blockdev comparev and writev ...passed
00:09:35.891 Test: blockdev nvme passthru rw ...passed
00:09:35.891 Test: blockdev nvme passthru vendor specific ...passed
00:09:35.891 Test: blockdev nvme admin passthru ...passed
00:09:35.891 Test: blockdev copy ...passed
00:09:35.891 Suite: bdevio tests on: raid1
00:09:35.891 Test: blockdev write read block ...passed
00:09:35.891 Test: blockdev write zeroes read block ...passed
00:09:35.891 Test: blockdev write zeroes read no split ...passed
00:09:35.891 Test: blockdev write zeroes read split ...passed
00:09:35.891 Test: blockdev write zeroes read split partial ...passed
00:09:35.891 Test: blockdev reset ...passed
00:09:35.891 Test: blockdev write read 8 blocks ...passed
00:09:35.891 Test: blockdev write read size > 128k ...passed
00:09:35.891 Test: blockdev write read invalid size ...passed
00:09:35.891 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:35.891 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:35.891 Test: blockdev write read max offset ...passed
00:09:35.891 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:35.891 Test: blockdev writev readv 8 blocks ...passed
00:09:35.891 Test: blockdev writev readv 30 x 1block ...passed
00:09:35.891 Test: blockdev writev readv block ...passed
00:09:35.891 Test: blockdev writev readv size > 128k ...passed
00:09:35.891 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:35.891 Test: blockdev comparev and writev ...passed
00:09:35.891 Test: blockdev nvme passthru rw ...passed
00:09:35.891 Test: blockdev nvme passthru vendor specific ...passed
00:09:35.891 Test: blockdev nvme admin passthru ...passed
00:09:35.891 Test: blockdev copy ...passed
00:09:35.891 Suite: bdevio tests on: concat0
00:09:35.891 Test: blockdev write read block ...passed
00:09:35.891 Test: blockdev write zeroes read block ...passed
00:09:35.891 Test: blockdev write zeroes read no split ...passed
00:09:35.891 Test: blockdev write zeroes read split ...passed
00:09:35.891 Test: blockdev write zeroes read split partial ...passed
00:09:35.891 Test: blockdev reset ...passed
00:09:35.891 Test: blockdev write read 8 blocks ...passed
00:09:35.891 Test: blockdev write read size > 128k ...passed
00:09:35.891 Test: blockdev write read invalid size ...passed
00:09:35.891 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:35.891 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:35.891 Test: blockdev write read max offset ...passed
00:09:35.891 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:35.891 Test: blockdev writev readv 8 blocks ...passed
00:09:35.891 Test: blockdev writev readv 30 x 1block ...passed
00:09:35.891 Test: blockdev writev readv block ...passed
00:09:35.891 Test: blockdev writev readv size > 128k ...passed
00:09:35.891 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:35.891 Test: blockdev comparev and writev ...passed
00:09:35.891 Test: blockdev nvme passthru rw ...passed
00:09:35.891 Test: blockdev nvme passthru vendor specific ...passed
00:09:35.891 Test: blockdev nvme admin passthru ...passed
00:09:35.891 Test: blockdev copy ...passed
00:09:35.891 Suite: bdevio tests on: raid0
00:09:35.891 Test: blockdev write read block ...passed
00:09:35.891 Test: blockdev write zeroes read block ...passed
00:09:35.891 Test: blockdev write zeroes read no split ...passed
00:09:35.891 Test: blockdev write zeroes read split ...passed
00:09:35.891 Test: blockdev write zeroes read split partial ...passed
00:09:35.891 Test: blockdev reset ...passed
00:09:35.891 Test: blockdev write read 8 blocks ...passed
00:09:35.891 Test: blockdev write read size > 128k ...passed
00:09:35.891 Test: blockdev write read invalid size ...passed
00:09:35.891 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:35.891 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:35.891 Test: blockdev write read max offset ...passed
00:09:35.891 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:35.891 Test: blockdev writev readv 8 blocks ...passed
00:09:35.891 Test: blockdev writev readv 30 x 1block ...passed
00:09:35.891 Test: blockdev writev readv block ...passed
00:09:35.891 Test: blockdev writev readv size > 128k ...passed
00:09:35.891 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:35.891 Test: blockdev comparev and writev ...passed
00:09:35.891 Test: blockdev nvme passthru rw ...passed
00:09:35.891 Test: blockdev nvme passthru vendor specific ...passed
00:09:35.891 Test: blockdev nvme admin passthru ...passed
00:09:35.891 Test: blockdev copy ...passed
00:09:35.891 Suite: bdevio tests on: TestPT
00:09:35.891 Test: blockdev write read block ...passed
00:09:35.891 Test: blockdev write zeroes read block ...passed
00:09:35.891 Test: blockdev write zeroes read no split ...passed
00:09:35.891 Test: blockdev write zeroes read split ...passed
00:09:35.891 Test: blockdev write zeroes read split partial ...passed
00:09:35.891 Test: blockdev reset ...passed
00:09:35.891 Test: blockdev write read 8 blocks ...passed
00:09:35.891 Test: blockdev write read size > 128k ...passed
00:09:35.891 Test: blockdev write read invalid size ...passed
00:09:35.891 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:35.891 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:35.891 Test: blockdev write read max offset ...passed
00:09:35.891 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:35.891 Test: blockdev writev readv 8 blocks ...passed
00:09:35.891 Test: blockdev writev readv 30 x 1block ...passed
00:09:35.891 Test: blockdev writev readv block ...passed
00:09:35.891 Test: blockdev writev readv size > 128k ...passed
00:09:35.891 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:35.891 Test: blockdev comparev and writev ...passed
00:09:35.891 Test: blockdev nvme passthru rw ...passed
00:09:35.891 Test: blockdev nvme passthru vendor specific ...passed
00:09:35.891 Test: blockdev nvme admin passthru ...passed
00:09:35.891 Test: blockdev copy ...passed
00:09:35.891 Suite: bdevio tests on: Malloc2p7
00:09:35.891 Test: blockdev write read block ...passed
00:09:35.891 Test: blockdev write zeroes read block ...passed
00:09:35.891 Test: blockdev write zeroes read no split ...passed
00:09:35.891 Test: blockdev write zeroes read split ...passed
00:09:35.891 Test: blockdev write zeroes read split partial ...passed
00:09:35.891 Test: blockdev reset ...passed
00:09:35.891 Test: blockdev write read 8 blocks ...passed
00:09:35.891 Test: blockdev write read size > 128k ...passed
00:09:35.891 Test: blockdev write read invalid size ...passed
00:09:35.891 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:35.891 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:35.891 Test: blockdev write read max offset ...passed
00:09:35.891 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:35.891 Test: blockdev writev readv 8 blocks ...passed
00:09:35.891 Test: blockdev writev readv 30 x 1block ...passed
00:09:35.891 Test: blockdev writev readv block ...passed
00:09:35.891 Test: blockdev writev readv size > 128k ...passed
00:09:35.891 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:35.891 Test: blockdev comparev and writev ...passed
00:09:35.891 Test: blockdev nvme passthru rw ...passed
00:09:35.891 Test: blockdev nvme passthru vendor specific ...passed
00:09:35.891 Test: blockdev nvme admin passthru ...passed
00:09:35.891 Test: blockdev copy ...passed
00:09:35.891 Suite: bdevio tests on: Malloc2p6
00:09:35.891 Test: blockdev write read block ...passed
00:09:35.891 Test: blockdev write zeroes read block ...passed
00:09:35.891 Test: blockdev write zeroes read no split ...passed
00:09:35.891 Test: blockdev write zeroes read split ...passed
00:09:36.151 Test: blockdev write zeroes read split partial ...passed
00:09:36.151 Test: blockdev reset ...passed
00:09:36.151 Test: blockdev write read 8 blocks ...passed
00:09:36.151 Test: blockdev write read size > 128k ...passed
00:09:36.151 Test: blockdev write read invalid size ...passed
00:09:36.151 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.151 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.151 Test: blockdev write read max offset ...passed
00:09:36.151 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.151 Test: blockdev writev readv 8 blocks ...passed
00:09:36.151 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.151 Test: blockdev writev readv block ...passed
00:09:36.151 Test: blockdev writev readv size > 128k ...passed
00:09:36.151 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.151 Test: blockdev comparev and writev ...passed
00:09:36.151 Test: blockdev nvme passthru rw ...passed
00:09:36.151 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.151 Test: blockdev nvme admin passthru ...passed
00:09:36.151 Test: blockdev copy ...passed
00:09:36.151 Suite: bdevio tests on: Malloc2p5
00:09:36.151 Test: blockdev write read block ...passed
00:09:36.151 Test: blockdev write zeroes read block ...passed
00:09:36.151 Test: blockdev write zeroes read no split ...passed
00:09:36.151 Test: blockdev write zeroes read split ...passed
00:09:36.151 Test: blockdev write zeroes read split partial ...passed
00:09:36.151 Test: blockdev reset ...passed
00:09:36.151 Test: blockdev write read 8 blocks ...passed
00:09:36.151 Test: blockdev write read size > 128k ...passed
00:09:36.151 Test: blockdev write read invalid size ...passed
00:09:36.151 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.151 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.151 Test: blockdev write read max offset ...passed
00:09:36.151 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.151 Test: blockdev writev readv 8 blocks ...passed
00:09:36.151 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.151 Test: blockdev writev readv block ...passed
00:09:36.151 Test: blockdev writev readv size > 128k ...passed
00:09:36.151 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.151 Test: blockdev comparev and writev ...passed
00:09:36.151 Test: blockdev nvme passthru rw ...passed
00:09:36.151 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.151 Test: blockdev nvme admin passthru ...passed
00:09:36.151 Test: blockdev copy ...passed
00:09:36.151 Suite: bdevio tests on: Malloc2p4
00:09:36.151 Test: blockdev write read block ...passed
00:09:36.151 Test: blockdev write zeroes read block ...passed
00:09:36.151 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152 Suite: bdevio tests on: Malloc2p3
00:09:36.152 Test: blockdev write read block ...passed
00:09:36.152 Test: blockdev write zeroes read block ...passed
00:09:36.152 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152 Suite: bdevio tests on: Malloc2p2
00:09:36.152 Test: blockdev write read block ...passed
00:09:36.152 Test: blockdev write zeroes read block ...passed
00:09:36.152 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152 Suite: bdevio tests on: Malloc2p1
00:09:36.152 Test: blockdev write read block ...passed
00:09:36.152 Test: blockdev write zeroes read block ...passed
00:09:36.152 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152 Suite: bdevio tests on: Malloc2p0
00:09:36.152 Test: blockdev write read block ...passed
00:09:36.152 Test: blockdev write zeroes read block ...passed
00:09:36.152 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152 Suite: bdevio tests on: Malloc1p1
00:09:36.152 Test: blockdev write read block ...passed
00:09:36.152 Test: blockdev write zeroes read block ...passed
00:09:36.152 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152 Suite: bdevio tests on: Malloc1p0
00:09:36.152 Test: blockdev write read block ...passed
00:09:36.152 Test: blockdev write zeroes read block ...passed
00:09:36.152 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152 Suite: bdevio tests on: Malloc0
00:09:36.152 Test: blockdev write read block ...passed
00:09:36.152 Test: blockdev write zeroes read block ...passed
00:09:36.152 Test: blockdev write zeroes read no split ...passed
00:09:36.152 Test: blockdev write zeroes read split ...passed
00:09:36.152 Test: blockdev write zeroes read split partial ...passed
00:09:36.152 Test: blockdev reset ...passed
00:09:36.152 Test: blockdev write read 8 blocks ...passed
00:09:36.152 Test: blockdev write read size > 128k ...passed
00:09:36.152 Test: blockdev write read invalid size ...passed
00:09:36.152 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:09:36.152 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:09:36.152 Test: blockdev write read max offset ...passed
00:09:36.152 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:09:36.152 Test: blockdev writev readv 8 blocks ...passed
00:09:36.152 Test: blockdev writev readv 30 x 1block ...passed
00:09:36.152 Test: blockdev writev readv block ...passed
00:09:36.152 Test: blockdev writev readv size > 128k ...passed
00:09:36.152 Test: blockdev writev readv size > 128k in two iovs ...passed
00:09:36.152 Test: blockdev comparev and writev ...passed
00:09:36.152 Test: blockdev nvme passthru rw ...passed
00:09:36.152 Test: blockdev nvme passthru vendor specific ...passed
00:09:36.152 Test: blockdev nvme admin passthru ...passed
00:09:36.152 Test: blockdev copy ...passed
00:09:36.152
00:09:36.152 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:09:36.153               suites     16     16    n/a      0        0
00:09:36.153                tests    368    368    368      0        0
00:09:36.153              asserts   2224   2224   2224      0      n/a
00:09:36.153
00:09:36.153 Elapsed time =    0.505 seconds
00:09:36.153 0
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 455527
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 455527 ']'
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 455527
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 455527
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 455527'
00:09:36.153 killing process with pid 455527
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 455527
00:09:36.153 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 455527
00:09:36.410 10:17:13 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT
00:09:36.410
00:09:36.410 real	0m1.716s
00:09:36.410 user	0m4.234s
00:09:36.410 sys	0m0.516s
00:09:36.410 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:36.410 10:17:13 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:09:36.410 ************************************
00:09:36.410 END TEST bdev_bounds
00:09:36.410 ************************************
00:09:36.410 10:17:13 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:09:36.410 10:17:13 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' ''
00:09:36.410 10:17:13 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:09:36.410 10:17:13 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:36.410 10:17:13 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:09:36.668 ************************************
00:09:36.668 START TEST bdev_nbd
00:09:36.668 ************************************
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' ''
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]]
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]]
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:09:36.668 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0')
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=455744
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 455744 /var/tmp/spdk-nbd.sock
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 455744 ']'
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:09:36.669 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable
00:09:36.669 10:17:13 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:09:36.669 [2024-07-15 10:17:13.706775] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:09:36.669 [2024-07-15 10:17:13.706838] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:09:36.926 [2024-07-15 10:17:13.838573] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:36.926 [2024-07-15 10:17:13.945082] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:36.926 [2024-07-15 10:17:14.106796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:09:36.926 [2024-07-15 10:17:14.106858] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:09:36.926 [2024-07-15 10:17:14.106873] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:09:36.926 [2024-07-15 10:17:14.114802] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:36.926 [2024-07-15 10:17:14.114828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:09:36.926 [2024-07-15 10:17:14.122815]
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:36.926 [2024-07-15 10:17:14.122841] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:37.184 [2024-07-15 10:17:14.199803] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:37.184 [2024-07-15 10:17:14.199852] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:37.184 [2024-07-15 10:17:14.199869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1427a40 00:09:37.184 [2024-07-15 10:17:14.199882] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:37.184 [2024-07-15 10:17:14.201317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:37.184 [2024-07-15 10:17:14.201346] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:37.487 10:17:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:37.487 10:17:14 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:09:37.487 10:17:14 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:37.487 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:37.488 10:17:14 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:38.054 1+0 records in 00:09:38.054 1+0 records out 00:09:38.054 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218227 s, 18.8 MB/s 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:38.054 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:38.314 1+0 records in 00:09:38.314 1+0 records out 00:09:38.314 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285769 s, 14.3 MB/s 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:38.314 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:38.574 1+0 records in 00:09:38.574 1+0 records out 00:09:38.574 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314232 s, 13.0 MB/s 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:38.574 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:38.833 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:09:38.833 10:17:15 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:38.833 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:38.833 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:38.833 10:17:15 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:38.833 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:39.092 1+0 records in 00:09:39.092 1+0 records out 00:09:39.092 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359047 s, 11.4 MB/s 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:39.092 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:39.350 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:39.351 1+0 records in 00:09:39.351 1+0 records out 00:09:39.351 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000352257 s, 11.6 MB/s 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:39.351 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:39.609 1+0 records in 00:09:39.609 1+0 records out 00:09:39.609 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000360922 s, 11.3 MB/s 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:39.609 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:39.868 1+0 records in 00:09:39.868 1+0 records out 00:09:39.868 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299671 s, 13.7 MB/s 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:39.868 10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:39.868 
10:17:16 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:09:40.126 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:09:40.126 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:09:40.126 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:09:40.126 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:40.127 1+0 records in 00:09:40.127 1+0 records out 00:09:40.127 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00044011 s, 9.3 MB/s 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:40.127 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:40.386 1+0 records in 00:09:40.386 1+0 records out 
00:09:40.386 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000454992 s, 9.0 MB/s 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:40.386 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:40.645 10:17:17 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:40.645 1+0 records in 00:09:40.645 1+0 records out 00:09:40.645 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000537783 s, 7.6 MB/s 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:40.645 10:17:17 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:40.904 1+0 records in 00:09:40.904 1+0 records out 00:09:40.904 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000460249 s, 8.9 MB/s 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:40.904 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:41.163 1+0 records in 00:09:41.163 1+0 records out 00:09:41.163 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000686139 s, 6.0 MB/s 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:41.163 
10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:41.163 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:41.421 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:41.422 1+0 records in 00:09:41.422 1+0 records out 00:09:41.422 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000530083 s, 7.7 MB/s 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:41.422 10:17:18 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@884 -- # size=4096 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:41.422 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:41.680 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:41.680 1+0 records in 00:09:41.680 1+0 records out 00:09:41.680 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00068059 s, 6.0 MB/s 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:41.938 10:17:18 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:42.197 1+0 records in 00:09:42.197 1+0 records out 00:09:42.197 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000626312 s, 6.5 MB/s 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:42.197 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:42.456 1+0 records in 00:09:42.456 1+0 records out 00:09:42.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000823619 s, 5.0 MB/s 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:42.456 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:42.720 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd0", 00:09:42.720 "bdev_name": "Malloc0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd1", 00:09:42.720 "bdev_name": "Malloc1p0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd2", 00:09:42.720 "bdev_name": "Malloc1p1" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd3", 00:09:42.720 "bdev_name": "Malloc2p0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd4", 00:09:42.720 "bdev_name": "Malloc2p1" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd5", 00:09:42.720 "bdev_name": "Malloc2p2" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd6", 00:09:42.720 "bdev_name": "Malloc2p3" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd7", 00:09:42.720 "bdev_name": "Malloc2p4" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd8", 00:09:42.720 "bdev_name": "Malloc2p5" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd9", 00:09:42.720 "bdev_name": "Malloc2p6" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd10", 00:09:42.720 "bdev_name": "Malloc2p7" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd11", 00:09:42.720 "bdev_name": "TestPT" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd12", 00:09:42.720 "bdev_name": "raid0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd13", 00:09:42.720 "bdev_name": "concat0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd14", 00:09:42.720 "bdev_name": "raid1" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd15", 00:09:42.720 "bdev_name": "AIO0" 00:09:42.720 } 00:09:42.720 ]' 00:09:42.720 10:17:19 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:09:42.720 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd0", 00:09:42.720 "bdev_name": "Malloc0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd1", 00:09:42.720 "bdev_name": "Malloc1p0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd2", 00:09:42.720 "bdev_name": "Malloc1p1" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd3", 00:09:42.720 "bdev_name": "Malloc2p0" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd4", 00:09:42.720 "bdev_name": "Malloc2p1" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd5", 00:09:42.720 "bdev_name": "Malloc2p2" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd6", 00:09:42.720 "bdev_name": "Malloc2p3" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd7", 00:09:42.720 "bdev_name": "Malloc2p4" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd8", 00:09:42.720 "bdev_name": "Malloc2p5" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd9", 00:09:42.720 "bdev_name": "Malloc2p6" 00:09:42.720 }, 00:09:42.720 { 00:09:42.720 "nbd_device": "/dev/nbd10", 00:09:42.720 "bdev_name": "Malloc2p7" 00:09:42.720 }, 00:09:42.720 { 00:09:42.721 "nbd_device": "/dev/nbd11", 00:09:42.721 "bdev_name": "TestPT" 00:09:42.721 }, 00:09:42.721 { 00:09:42.721 "nbd_device": "/dev/nbd12", 00:09:42.721 "bdev_name": "raid0" 00:09:42.721 }, 00:09:42.721 { 00:09:42.721 "nbd_device": "/dev/nbd13", 00:09:42.721 "bdev_name": "concat0" 00:09:42.721 }, 00:09:42.721 { 00:09:42.721 "nbd_device": "/dev/nbd14", 00:09:42.721 "bdev_name": "raid1" 00:09:42.721 }, 00:09:42.721 { 00:09:42.721 "nbd_device": "/dev/nbd15", 00:09:42.721 "bdev_name": "AIO0" 00:09:42.721 } 00:09:42.721 ]' 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:42.721 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:42.979 10:17:19 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:42.979 10:17:19 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:43.237 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:43.237 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:43.237 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:43.237 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:43.237 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:43.238 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:43.238 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:43.238 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:43.238 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:43.238 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:43.496 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:43.755 10:17:20 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:44.013 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:44.271 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:44.530 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:44.788 10:17:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:45.046 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:45.304 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:45.304 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:45.304 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:45.305 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:45.305 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:45.305 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:45.305 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:45.305 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:45.305 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:45.563 10:17:22 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:45.563 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:45.821 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:45.822 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:45.822 10:17:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:46.080 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:46.081 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:46.081 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:46.338 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:46.631 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:46.923 10:17:23 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:46.923 10:17:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:09:47.181 10:17:24 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:47.182 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:47.466 /dev/nbd0 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:47.466 1+0 records in 00:09:47.466 1+0 records out 00:09:47.466 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000140248 s, 29.2 MB/s 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:47.466 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:09:47.724 /dev/nbd1 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:09:47.724 1+0 records in 00:09:47.724 1+0 records out 00:09:47.724 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283426 s, 14.5 MB/s 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:47.724 10:17:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:09:47.982 /dev/nbd10 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:47.982 10:17:25 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:47.982 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:47.982 1+0 records in 00:09:47.982 1+0 records out 00:09:47.982 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301408 s, 13.6 MB/s 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:47.983 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:09:48.241 /dev/nbd11 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:48.241 1+0 records in 00:09:48.241 1+0 records out 00:09:48.241 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000330826 s, 12.4 MB/s 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:48.241 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:09:48.499 /dev/nbd12 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:48.499 1+0 records in 00:09:48.499 1+0 records out 00:09:48.499 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000343384 s, 11.9 MB/s 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:48.499 10:17:25 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:48.499 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:48.757 /dev/nbd13 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:48.757 1+0 records in 00:09:48.757 1+0 records out 00:09:48.757 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000403758 s, 10.1 MB/s 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:48.757 10:17:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:49.016 /dev/nbd14 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.016 1+0 records in 00:09:49.016 1+0 records out 00:09:49.016 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000488957 s, 
8.4 MB/s 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:49.016 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:49.274 /dev/nbd15 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.274 1+0 records in 00:09:49.274 1+0 records out 00:09:49.274 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00044152 s, 9.3 MB/s 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:49.274 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:49.532 /dev/nbd2 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.532 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.532 1+0 records in 00:09:49.532 1+0 records out 00:09:49.532 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000557517 s, 7.3 MB/s 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:49.533 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:49.791 /dev/nbd3 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:49.791 10:17:26 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:49.791 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:49.791 1+0 records in 00:09:49.791 1+0 records out 00:09:49.791 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000477684 s, 8.6 MB/s 00:09:50.049 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.049 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.049 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.049 10:17:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:50.049 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.049 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:50.049 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:50.049 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:50.049 /dev/nbd4 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:50.308 1+0 records in 00:09:50.308 1+0 records out 00:09:50.308 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000460699 s, 8.9 MB/s 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:50.308 /dev/nbd5 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:50.308 1+0 records in 00:09:50.308 1+0 records out 00:09:50.308 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000659749 s, 6.2 MB/s 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.308 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:50.567 /dev/nbd6 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:50.567 1+0 records in 00:09:50.567 1+0 records out 00:09:50.567 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000777636 s, 5.3 MB/s 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:50.567 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:50.826 /dev/nbd7 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:09:50.826 10:17:27 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:50.826 1+0 records in 00:09:50.826 1+0 records out 00:09:50.826 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000789156 s, 5.2 MB/s 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:50.826 10:17:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:51.084 /dev/nbd8 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:51.084 1+0 records in 00:09:51.084 1+0 records out 00:09:51.084 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000842632 s, 4.9 MB/s 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:51.084 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:09:51.342 /dev/nbd9 00:09:51.342 10:17:28 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:51.342 1+0 records in 00:09:51.342 1+0 records out 00:09:51.342 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000580841 s, 7.1 MB/s 00:09:51.342 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:51.600 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:51.600 { 00:09:51.600 "nbd_device": "/dev/nbd0", 00:09:51.600 "bdev_name": "Malloc0" 00:09:51.600 }, 00:09:51.600 { 00:09:51.600 "nbd_device": "/dev/nbd1", 00:09:51.600 "bdev_name": "Malloc1p0" 00:09:51.600 }, 00:09:51.600 { 00:09:51.601 "nbd_device": "/dev/nbd10", 00:09:51.601 "bdev_name": "Malloc1p1" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd11", 00:09:51.601 "bdev_name": "Malloc2p0" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd12", 00:09:51.601 "bdev_name": "Malloc2p1" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd13", 00:09:51.601 "bdev_name": "Malloc2p2" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd14", 00:09:51.601 "bdev_name": "Malloc2p3" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd15", 00:09:51.601 "bdev_name": "Malloc2p4" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd2", 00:09:51.601 "bdev_name": "Malloc2p5" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd3", 00:09:51.601 "bdev_name": "Malloc2p6" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd4", 00:09:51.601 "bdev_name": "Malloc2p7" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd5", 00:09:51.601 "bdev_name": "TestPT" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd6", 00:09:51.601 
"bdev_name": "raid0" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd7", 00:09:51.601 "bdev_name": "concat0" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd8", 00:09:51.601 "bdev_name": "raid1" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd9", 00:09:51.601 "bdev_name": "AIO0" 00:09:51.601 } 00:09:51.601 ]' 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd0", 00:09:51.601 "bdev_name": "Malloc0" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd1", 00:09:51.601 "bdev_name": "Malloc1p0" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd10", 00:09:51.601 "bdev_name": "Malloc1p1" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd11", 00:09:51.601 "bdev_name": "Malloc2p0" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd12", 00:09:51.601 "bdev_name": "Malloc2p1" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd13", 00:09:51.601 "bdev_name": "Malloc2p2" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd14", 00:09:51.601 "bdev_name": "Malloc2p3" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd15", 00:09:51.601 "bdev_name": "Malloc2p4" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd2", 00:09:51.601 "bdev_name": "Malloc2p5" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd3", 00:09:51.601 "bdev_name": "Malloc2p6" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd4", 00:09:51.601 "bdev_name": "Malloc2p7" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd5", 00:09:51.601 "bdev_name": "TestPT" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd6", 00:09:51.601 "bdev_name": "raid0" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd7", 00:09:51.601 "bdev_name": "concat0" 00:09:51.601 }, 00:09:51.601 { 
00:09:51.601 "nbd_device": "/dev/nbd8", 00:09:51.601 "bdev_name": "raid1" 00:09:51.601 }, 00:09:51.601 { 00:09:51.601 "nbd_device": "/dev/nbd9", 00:09:51.601 "bdev_name": "AIO0" 00:09:51.601 } 00:09:51.601 ]' 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:51.601 /dev/nbd1 00:09:51.601 /dev/nbd10 00:09:51.601 /dev/nbd11 00:09:51.601 /dev/nbd12 00:09:51.601 /dev/nbd13 00:09:51.601 /dev/nbd14 00:09:51.601 /dev/nbd15 00:09:51.601 /dev/nbd2 00:09:51.601 /dev/nbd3 00:09:51.601 /dev/nbd4 00:09:51.601 /dev/nbd5 00:09:51.601 /dev/nbd6 00:09:51.601 /dev/nbd7 00:09:51.601 /dev/nbd8 00:09:51.601 /dev/nbd9' 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:51.601 /dev/nbd1 00:09:51.601 /dev/nbd10 00:09:51.601 /dev/nbd11 00:09:51.601 /dev/nbd12 00:09:51.601 /dev/nbd13 00:09:51.601 /dev/nbd14 00:09:51.601 /dev/nbd15 00:09:51.601 /dev/nbd2 00:09:51.601 /dev/nbd3 00:09:51.601 /dev/nbd4 00:09:51.601 /dev/nbd5 00:09:51.601 /dev/nbd6 00:09:51.601 /dev/nbd7 00:09:51.601 /dev/nbd8 00:09:51.601 /dev/nbd9' 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:51.601 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:51.860 256+0 records in 00:09:51.860 256+0 records out 00:09:51.860 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102627 s, 102 MB/s 00:09:51.860 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:51.860 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:51.860 256+0 records in 00:09:51.860 256+0 records out 00:09:51.860 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.180878 s, 5.8 MB/s 00:09:51.860 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:51.860 10:17:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:52.118 256+0 records in 00:09:52.118 256+0 records out 00:09:52.118 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.1841 s, 5.7 MB/s 00:09:52.118 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:09:52.118 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:52.118 256+0 records in 00:09:52.118 256+0 records out 00:09:52.118 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.118098 s, 8.9 MB/s 00:09:52.118 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:52.118 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:52.376 256+0 records in 00:09:52.376 256+0 records out 00:09:52.376 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183985 s, 5.7 MB/s 00:09:52.376 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:52.376 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:52.640 256+0 records in 00:09:52.640 256+0 records out 00:09:52.640 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0968835 s, 10.8 MB/s 00:09:52.640 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:52.640 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:52.640 256+0 records in 00:09:52.640 256+0 records out 00:09:52.640 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182814 s, 5.7 MB/s 00:09:52.640 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:52.640 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:09:52.900 256+0 records in 00:09:52.900 
256+0 records out 00:09:52.900 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183215 s, 5.7 MB/s 00:09:52.900 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:52.900 10:17:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:52.900 256+0 records in 00:09:52.900 256+0 records out 00:09:52.900 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.114001 s, 9.2 MB/s 00:09:52.900 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:52.900 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:53.158 256+0 records in 00:09:53.158 256+0 records out 00:09:53.159 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.115612 s, 9.1 MB/s 00:09:53.159 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:53.159 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:53.417 256+0 records in 00:09:53.417 256+0 records out 00:09:53.417 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184032 s, 5.7 MB/s 00:09:53.417 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:53.417 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:53.417 256+0 records in 00:09:53.417 256+0 records out 00:09:53.417 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184039 s, 5.7 MB/s 00:09:53.417 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:53.417 10:17:30 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:53.676 256+0 records in 00:09:53.676 256+0 records out 00:09:53.676 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184065 s, 5.7 MB/s 00:09:53.676 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:53.676 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:53.934 256+0 records in 00:09:53.934 256+0 records out 00:09:53.934 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.164738 s, 6.4 MB/s 00:09:53.934 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:53.934 10:17:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:53.934 256+0 records in 00:09:53.934 256+0 records out 00:09:53.934 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.113431 s, 9.2 MB/s 00:09:53.934 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:53.934 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:54.193 256+0 records in 00:09:54.193 256+0 records out 00:09:54.193 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.110591 s, 9.5 MB/s 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:54.193 256+0 records in 00:09:54.193 256+0 records out 00:09:54.193 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.176494 s, 5.9 MB/s 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:54.193 
10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.193 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:54.453 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:54.712 10:17:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:54.970 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:55.229 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:55.488 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:55.746 10:17:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:56.006 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14
00:09:56.264 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14
00:09:56.264 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14
00:09:56.264 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14
00:09:56.265 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:56.265 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:56.265 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions
00:09:56.265 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:56.265 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:56.265 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:56.265 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15
00:09:56.523 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15
00:09:56.523 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15
00:09:56.523 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15
00:09:56.523 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:56.523 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:56.523 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions
00:09:56.524 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:56.524 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:56.524 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:56.524 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2
/proc/partitions
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:56.782 10:17:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:57.040 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4
00:09:57.299 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4
00:09:57.299 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4
00:09:57.299 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4
00:09:57.299 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:57.299 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:57.299 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions
00:09:57.558 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:57.558 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:57.558 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:57.558 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:57.816 10:17:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6
00:09:58.074 10:17:35 blockdev_general.bdev_nbd --
bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:58.074 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:58.333 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:09:58.592 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9
00:09:58.851 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9
00:09:58.851 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9
00:09:58.851 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9
00:09:58.851 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:09:58.852 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:09:58.852 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions
00:09:58.852 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:09:58.852 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:09:58.852 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:09:58.852 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:09:58.852 10:17:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock
nbd_get_disks
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9'
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret
00:09:59.111 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512
00:09:59.370 malloc_lvol_verify
00:09:59.370 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs
00:09:59.629 c628a713-02e6-46e8-a0f8-6befd95cf646
00:09:59.629 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs
00:09:59.888 8e100c8e-cdcd-49e5-9551-f69740b0fa3d
00:09:59.888 10:17:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0
00:10:00.147 /dev/nbd0
00:10:00.147 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0
00:10:00.147 mke2fs 1.46.5 (30-Dec-2021)
00:10:00.147 Discarding device blocks: 0/4096 done
00:10:00.147 Creating filesystem with 4096 1k blocks and 1024 inodes
00:10:00.147
00:10:00.147 Allocating group tables: 0/1 done
00:10:00.147 Writing inode tables: 0/1 done
00:10:00.147 Creating journal (1024 blocks): done
00:10:00.147 Writing superblocks and filesystem accounting information: 0/1 done
00:10:00.147
00:10:00.147 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0
00:10:00.147 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0
00:10:00.147 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:10:00.147 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:10:00.148 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- #
local nbd_list
00:10:00.148 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:10:00.148 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:10:00.148 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']'
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 455744
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 455744 ']'
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 455744
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 455744
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 455744'
00:10:00.409 killing process with pid 455744
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 455744
00:10:00.409 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 455744
00:10:00.718 10:17:37 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT
00:10:00.718
00:10:00.718 real 0m24.133s
00:10:00.718 user 0m30.176s
00:10:00.718 sys 0m14.010s
00:10:00.718 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:00.718 10:17:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:10:00.718 ************************************
00:10:00.718 END TEST bdev_nbd
00:10:00.718 ************************************
00:10:00.718 10:17:37 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:00.718 10:17:37 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]]
00:10:00.718 10:17:37 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']'
00:10:00.718 10:17:37 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']'
00:10:00.718 10:17:37 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite ''
00:10:00.718 10:17:37 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:10:00.718 10:17:37 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:00.718 10:17:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:10:00.718 ************************************
00:10:00.718 START TEST bdev_fio
00:10:00.718 ************************************
00:10:00.718 10:17:37 blockdev_general.bdev_fio --
common/autotest_common.sh@1123 -- # fio_test_suite ''
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:10:00.718 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo ''
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=//
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context=
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO ''
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context=
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']'
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']'
00:10:00.718 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']'
00:10:00.719 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:10:00.719 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat
00:10:00.719 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']'
00:10:00.719 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat
00:10:00.719 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']'
00:10:00.719 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version
00:10:00.977 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]]
00:10:00.977 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo
'[job_Malloc2p0]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable
00:10:00.978 10:17:37 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:10:00.978 ************************************
00:10:00.978 START TEST bdev_fio_rw_verify
00:10:00.978 ************************************
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan')
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib=
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan
00:10:00.978 10:17:37 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:10:00.978 10:17:38 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:10:01.237 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:10:01.237 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:10:01.237 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:10:01.237 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:10:01.237 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:10:01.237 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:10:01.237 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:10:01.237 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B,
ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:01.237 fio-3.35 00:10:01.237 Starting 16 threads 00:10:13.444 00:10:13.444 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=459748: Mon Jul 15 10:17:49 2024 00:10:13.444 read: IOPS=92.5k, BW=361MiB/s (379MB/s)(3614MiB/10001msec) 00:10:13.444 slat (usec): min=2, max=225, avg=34.11, stdev=14.60 00:10:13.444 clat (usec): min=12, max=4943, avg=280.81, stdev=134.23 00:10:13.444 lat (usec): min=23, max=5010, avg=314.92, stdev=142.72 00:10:13.444 clat percentiles (usec): 00:10:13.444 | 50.000th=[ 269], 99.000th=[ 586], 99.900th=[ 676], 99.990th=[ 889], 00:10:13.444 | 99.999th=[ 1336] 00:10:13.444 write: IOPS=144k, BW=564MiB/s (591MB/s)(5571MiB/9879msec); 0 zone resets 00:10:13.444 slat (usec): min=5, max=1199, avg=48.06, stdev=15.57 00:10:13.444 clat (usec): min=12, max=1754, avg=336.05, stdev=157.65 00:10:13.444 lat (usec): min=32, max=1831, avg=384.11, stdev=166.11 00:10:13.444 clat percentiles (usec): 
00:10:13.444 | 50.000th=[ 318], 99.000th=[ 734], 99.900th=[ 906], 99.990th=[ 1020], 00:10:13.444 | 99.999th=[ 1418] 00:10:13.444 bw ( KiB/s): min=454593, max=757191, per=99.58%, avg=575030.37, stdev=5938.22, samples=304 00:10:13.444 iops : min=113646, max=189296, avg=143755.53, stdev=1484.51, samples=304 00:10:13.444 lat (usec) : 20=0.01%, 50=0.63%, 100=4.91%, 250=32.81%, 500=49.08% 00:10:13.444 lat (usec) : 750=12.00%, 1000=0.55% 00:10:13.444 lat (msec) : 2=0.01%, 10=0.01% 00:10:13.444 cpu : usr=99.22%, sys=0.38%, ctx=645, majf=0, minf=2615 00:10:13.444 IO depths : 1=12.4%, 2=24.9%, 4=50.2%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:13.444 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:13.444 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:13.444 issued rwts: total=925165,1426103,0,0 short=0,0,0,0 dropped=0,0,0,0 00:10:13.444 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:13.444 00:10:13.444 Run status group 0 (all jobs): 00:10:13.444 READ: bw=361MiB/s (379MB/s), 361MiB/s-361MiB/s (379MB/s-379MB/s), io=3614MiB (3789MB), run=10001-10001msec 00:10:13.444 WRITE: bw=564MiB/s (591MB/s), 564MiB/s-564MiB/s (591MB/s-591MB/s), io=5571MiB (5841MB), run=9879-9879msec 00:10:13.444 00:10:13.444 real 0m12.277s 00:10:13.444 user 2m45.668s 00:10:13.444 sys 0m1.402s 00:10:13.444 10:17:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:13.444 10:17:50 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:10:13.444 ************************************ 00:10:13.444 END TEST bdev_fio_rw_verify 00:10:13.444 ************************************ 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:10:13.444 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:13.446 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "3461f003-5fff-4e36-9600-aba399031aad"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3461f003-5fff-4e36-9600-aba399031aad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "71fc0429-c921-5890-9d66-6252e8056378"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "71fc0429-c921-5890-9d66-6252e8056378",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "6d7ec95c-d02a-5881-88ce-91440244eeda"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6d7ec95c-d02a-5881-88ce-91440244eeda",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f938e1c6-b542-5322-b506-0a07e7343464"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f938e1c6-b542-5322-b506-0a07e7343464",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "6f5a0ec8-a7c7-577a-a7d4-46e42ef959b8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6f5a0ec8-a7c7-577a-a7d4-46e42ef959b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "3ed60577-73ce-5c69-8f6e-d03f6cc53c54"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3ed60577-73ce-5c69-8f6e-d03f6cc53c54",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "ee3af2cd-94b0-5b3d-9c6b-d1733413033f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ee3af2cd-94b0-5b3d-9c6b-d1733413033f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "d76924a9-eedb-5d9a-807a-93ba44232012"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d76924a9-eedb-5d9a-807a-93ba44232012",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"28eb2431-937c-58f8-bd08-ea036f537dc1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "28eb2431-937c-58f8-bd08-ea036f537dc1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c7de46ca-b411-55ea-a124-a96fb26daf28"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c7de46ca-b411-55ea-a124-a96fb26daf28",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "ac841b17-f124-56a9-999a-eb69560d5bba"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ac841b17-f124-56a9-999a-eb69560d5bba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "846f4182-c8e5-5d43-bcf1-f9bce6ad40a9"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "846f4182-c8e5-5d43-bcf1-f9bce6ad40a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "332bc8fe-5417-4df4-8758-a7250d8f94d5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "332bc8fe-5417-4df4-8758-a7250d8f94d5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "332bc8fe-5417-4df4-8758-a7250d8f94d5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "b97519f1-2e0e-45f3-bf16-2cbfd97cbf70",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "77286cc1-4075-4b8d-80d9-fc65a0d5f0cf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "9436d2b5-d694-433a-b884-d6d2fbaa74a2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "468838cc-ead3-4a3b-8e36-f3764b7af508",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "5bdcd4fc-a5d6-493f-a238-84c6d8921856"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5bdcd4fc-a5d6-493f-a238-84c6d8921856",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "5bdcd4fc-a5d6-493f-a238-84c6d8921856",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a92edafb-ca1b-449a-8bdf-754d7bbbac49",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "177908ca-10ed-4e60-b3fb-83e77ff90146",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5424c682-6f99-4005-b842-a43f323ef3e0"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5424c682-6f99-4005-b842-a43f323ef3e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}'
00:10:13.446 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0
00:10:13.446 Malloc1p0
00:10:13.446 Malloc1p1
00:10:13.446 Malloc2p0
00:10:13.446 Malloc2p1
00:10:13.446 Malloc2p2
00:10:13.446 Malloc2p3
00:10:13.446 Malloc2p4
00:10:13.446 Malloc2p5
00:10:13.446 Malloc2p6
00:10:13.446 Malloc2p7
00:10:13.446 TestPT
00:10:13.446 raid0
00:10:13.446 concat0 ]]
00:10:13.446 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name'
00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "3461f003-5fff-4e36-9600-aba399031aad"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "3461f003-5fff-4e36-9600-aba399031aad",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole":
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "71fc0429-c921-5890-9d66-6252e8056378"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "71fc0429-c921-5890-9d66-6252e8056378",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "6d7ec95c-d02a-5881-88ce-91440244eeda"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6d7ec95c-d02a-5881-88ce-91440244eeda",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f938e1c6-b542-5322-b506-0a07e7343464"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f938e1c6-b542-5322-b506-0a07e7343464",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "6f5a0ec8-a7c7-577a-a7d4-46e42ef959b8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "6f5a0ec8-a7c7-577a-a7d4-46e42ef959b8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "3ed60577-73ce-5c69-8f6e-d03f6cc53c54"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "3ed60577-73ce-5c69-8f6e-d03f6cc53c54",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "ee3af2cd-94b0-5b3d-9c6b-d1733413033f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ee3af2cd-94b0-5b3d-9c6b-d1733413033f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "d76924a9-eedb-5d9a-807a-93ba44232012"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d76924a9-eedb-5d9a-807a-93ba44232012",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "28eb2431-937c-58f8-bd08-ea036f537dc1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "28eb2431-937c-58f8-bd08-ea036f537dc1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "c7de46ca-b411-55ea-a124-a96fb26daf28"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c7de46ca-b411-55ea-a124-a96fb26daf28",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "ac841b17-f124-56a9-999a-eb69560d5bba"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "ac841b17-f124-56a9-999a-eb69560d5bba",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "846f4182-c8e5-5d43-bcf1-f9bce6ad40a9"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "846f4182-c8e5-5d43-bcf1-f9bce6ad40a9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "332bc8fe-5417-4df4-8758-a7250d8f94d5"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "332bc8fe-5417-4df4-8758-a7250d8f94d5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "332bc8fe-5417-4df4-8758-a7250d8f94d5",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "b97519f1-2e0e-45f3-bf16-2cbfd97cbf70",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "77286cc1-4075-4b8d-80d9-fc65a0d5f0cf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "1a5a7f03-0259-41dd-b8e1-770dfbe6ac59",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "9436d2b5-d694-433a-b884-d6d2fbaa74a2",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "468838cc-ead3-4a3b-8e36-f3764b7af508",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "5bdcd4fc-a5d6-493f-a238-84c6d8921856"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "5bdcd4fc-a5d6-493f-a238-84c6d8921856",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "5bdcd4fc-a5d6-493f-a238-84c6d8921856",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "a92edafb-ca1b-449a-8bdf-754d7bbbac49",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "177908ca-10ed-4e60-b3fb-83e77ff90146",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "5424c682-6f99-4005-b842-a43f323ef3e0"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "5424c682-6f99-4005-b842-a43f323ef3e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.447 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:13.448 10:17:50 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:13.448 ************************************ 00:10:13.448 START TEST bdev_fio_trim 00:10:13.448 ************************************ 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:13.448 10:17:50 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:13.448 10:17:50 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:13.707 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:13.707 fio-3.35 00:10:13.707 Starting 14 threads 00:10:25.895 00:10:25.895 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=461492: Mon Jul 15 10:18:01 2024 00:10:25.895 write: IOPS=128k, BW=498MiB/s (522MB/s)(4982MiB/10001msec); 0 zone resets 00:10:25.895 slat (usec): min=2, max=218, avg=38.79, stdev=11.07 
00:10:25.895 clat (usec): min=24, max=3876, avg=276.13, stdev=92.44 00:10:25.895 lat (usec): min=32, max=3918, avg=314.92, stdev=96.50 00:10:25.895 clat percentiles (usec): 00:10:25.895 | 50.000th=[ 269], 99.000th=[ 486], 99.900th=[ 529], 99.990th=[ 652], 00:10:25.895 | 99.999th=[ 1090] 00:10:25.895 bw ( KiB/s): min=450624, max=704198, per=100.00%, avg=511661.79, stdev=4631.75, samples=266 00:10:25.895 iops : min=112656, max=176048, avg=127915.37, stdev=1157.92, samples=266 00:10:25.895 trim: IOPS=128k, BW=498MiB/s (522MB/s)(4982MiB/10001msec); 0 zone resets 00:10:25.895 slat (usec): min=4, max=3571, avg=26.12, stdev= 7.77 00:10:25.895 clat (usec): min=4, max=3918, avg=310.47, stdev=100.90 00:10:25.895 lat (usec): min=11, max=3946, avg=336.59, stdev=104.03 00:10:25.895 clat percentiles (usec): 00:10:25.895 | 50.000th=[ 302], 99.000th=[ 537], 99.900th=[ 578], 99.990th=[ 627], 00:10:25.895 | 99.999th=[ 881] 00:10:25.895 bw ( KiB/s): min=450624, max=704206, per=100.00%, avg=511662.21, stdev=4631.88, samples=266 00:10:25.895 iops : min=112656, max=176050, avg=127915.37, stdev=1157.95, samples=266 00:10:25.895 lat (usec) : 10=0.01%, 20=0.01%, 50=0.02%, 100=0.75%, 250=36.53% 00:10:25.895 lat (usec) : 500=60.94%, 750=1.74%, 1000=0.01% 00:10:25.895 lat (msec) : 2=0.01%, 4=0.01% 00:10:25.895 cpu : usr=99.59%, sys=0.00%, ctx=560, majf=0, minf=913 00:10:25.895 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:25.895 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:25.895 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:25.895 issued rwts: total=0,1275463,1275466,0 short=0,0,0,0 dropped=0,0,0,0 00:10:25.895 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:25.895 00:10:25.895 Run status group 0 (all jobs): 00:10:25.895 WRITE: bw=498MiB/s (522MB/s), 498MiB/s-498MiB/s (522MB/s-522MB/s), io=4982MiB (5224MB), run=10001-10001msec 00:10:25.895 TRIM: bw=498MiB/s (522MB/s), 
498MiB/s-498MiB/s (522MB/s-522MB/s), io=4982MiB (5224MB), run=10001-10001msec 00:10:25.895 00:10:25.895 real 0m11.711s 00:10:25.895 user 2m25.763s 00:10:25.895 sys 0m0.613s 00:10:25.895 10:18:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:25.895 10:18:02 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:10:25.895 ************************************ 00:10:25.895 END TEST bdev_fio_trim 00:10:25.895 ************************************ 00:10:25.895 10:18:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:25.895 10:18:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:10:25.895 10:18:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:25.895 10:18:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:10:25.895 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:25.895 10:18:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:10:25.895 00:10:25.895 real 0m24.379s 00:10:25.895 user 5m11.641s 00:10:25.895 sys 0m2.230s 00:10:25.895 10:18:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:25.895 10:18:02 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:25.895 ************************************ 00:10:25.895 END TEST bdev_fio 00:10:25.895 ************************************ 00:10:25.895 10:18:02 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:25.895 10:18:02 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:25.895 10:18:02 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 
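Annotation: the trim job file assembled in the trace above (blockdev.sh lines 356-358) keeps only bdevs whose JSON reports `"unmap": true`, which is why raid1 and AIO0 from the bdev dump get no `[job_*]` section while Malloc0 through concat0 do. A minimal Python sketch of that jq `select()` filter, using a hypothetical cut-down bdev list with the same field names as the log:

```python
# Trimmed-down bdev records mirroring fields from the JSON dump in this log.
bdevs = [
    {"name": "Malloc2p7", "supported_io_types": {"unmap": True}},
    {"name": "raid0", "supported_io_types": {"unmap": True}},
    {"name": "raid1", "supported_io_types": {"unmap": False}},  # skipped: no unmap
    {"name": "AIO0", "supported_io_types": {"unmap": False}},   # skipped: no unmap
]

def fio_trim_sections(bdevs):
    """Emit one [job_<name>] / filename=<name> pair per unmap-capable bdev,
    mirroring the echo '[job_...]' loop traced above."""
    lines = []
    for b in bdevs:
        if b["supported_io_types"]["unmap"]:
            lines.append(f"[job_{b['name']}]")
            lines.append(f"filename={b['name']}")
    return "\n".join(lines)

print(fio_trim_sections(bdevs))
```

This reproduces the selection logic only; the real script pipes `printf '%s\n' "${bdevs[@]}"` through `jq -r 'select(.supported_io_types.unmap == true) | .name'`.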
00:10:25.895 10:18:02 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:25.895 10:18:02 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:25.895 10:18:02 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:25.895 ************************************ 00:10:25.895 START TEST bdev_verify 00:10:25.895 ************************************ 00:10:25.895 10:18:02 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:25.895 [2024-07-15 10:18:02.387277] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:10:25.895 [2024-07-15 10:18:02.387329] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid463128 ] 00:10:25.895 [2024-07-15 10:18:02.499586] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:25.895 [2024-07-15 10:18:02.601456] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:25.895 [2024-07-15 10:18:02.601461] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:25.895 [2024-07-15 10:18:02.754161] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:25.895 [2024-07-15 10:18:02.754216] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:25.895 [2024-07-15 10:18:02.754232] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:25.895 [2024-07-15 10:18:02.762165] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:25.895 [2024-07-15 10:18:02.762192] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:25.895 [2024-07-15 10:18:02.770180] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:25.895 [2024-07-15 10:18:02.770204] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:25.895 [2024-07-15 10:18:02.847549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:25.895 [2024-07-15 10:18:02.847604] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:25.895 [2024-07-15 10:18:02.847623] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x206b4d0 00:10:25.895 [2024-07-15 10:18:02.847636] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:25.895 [2024-07-15 10:18:02.849302] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:25.895 [2024-07-15 10:18:02.849332] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:25.895 Running I/O for 5 seconds... 
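Annotation: bdevperf runs with 4 KiB IOs (`-o 4096`), so the MiB/s column in the verify tables is just IOPS scaled by the IO size, and fio's summaries print the same bandwidth in both binary MiB/s and decimal MB/s. A quick sketch of both conversions, cross-checked against figures already in this log:

```python
def iops_to_mib_s(iops: float, io_size: int = 4096) -> float:
    """Bandwidth in MiB/s for fixed-size IOs: IOPS * io_size / 2**20."""
    return iops * io_size / (1 << 20)

def mib_to_mb(mib_s: float) -> float:
    """fio's parallel decimal figure: MiB/s * 2**20 / 10**6."""
    return mib_s * (1 << 20) / 1_000_000

# Malloc0 row: 1048.97 IOPS at 4 KiB matches the reported 4.10 MiB/s.
print(f"{iops_to_mib_s(1048.97):.2f}")
# Trim run summary: 498 MiB/s is the same bandwidth fio prints as ~522 MB/s.
print(round(mib_to_mb(498)))
```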
00:10:32.456 
00:10:32.456 Latency(us)
00:10:32.456 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:32.456 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x1000
00:10:32.456 Malloc0 : 5.13 1048.97 4.10 0.00 0.00 121754.08 552.07 426724.84
00:10:32.456 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x1000 length 0x1000
00:10:32.456 Malloc0 : 5.11 1026.71 4.01 0.00 0.00 124390.30 544.95 474138.71
00:10:32.456 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x800
00:10:32.456 Malloc1p0 : 5.13 549.22 2.15 0.00 0.00 231695.76 3490.50 235245.75
00:10:32.456 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x800 length 0x800
00:10:32.456 Malloc1p0 : 5.11 550.66 2.15 0.00 0.00 231129.88 3462.01 235245.75
00:10:32.456 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x800
00:10:32.456 Malloc1p1 : 5.13 548.99 2.14 0.00 0.00 231071.74 3647.22 228863.11
00:10:32.456 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x800 length 0x800
00:10:32.456 Malloc1p1 : 5.12 550.41 2.15 0.00 0.00 230461.61 3647.22 229774.91
00:10:32.456 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p0 : 5.13 548.76 2.14 0.00 0.00 230435.68 3519.00 224304.08
00:10:32.456 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p0 : 5.12 550.17 2.15 0.00 0.00 229836.43 3490.50 225215.89
00:10:32.456 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p1 : 5.13 548.53 2.14 0.00 0.00 229821.28 3647.22 220656.86
00:10:32.456 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p1 : 5.12 549.91 2.15 0.00 0.00 229219.44 3675.71 221568.67
00:10:32.456 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p2 : 5.14 548.30 2.14 0.00 0.00 229165.14 3476.26 217921.45
00:10:32.456 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p2 : 5.27 558.92 2.18 0.00 0.00 224893.89 3490.50 218833.25
00:10:32.456 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p3 : 5.27 558.43 2.18 0.00 0.00 224381.20 3376.53 215186.03
00:10:32.456 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p3 : 5.27 558.68 2.18 0.00 0.00 224279.00 3390.78 216097.84
00:10:32.456 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p4 : 5.28 557.92 2.18 0.00 0.00 223916.92 3561.74 213362.42
00:10:32.456 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p4 : 5.27 558.40 2.18 0.00 0.00 223699.92 3561.74 215186.03
00:10:32.456 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p5 : 5.28 557.42 2.18 0.00 0.00 223436.20 3462.01 211538.81
00:10:32.456 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p5 : 5.28 557.89 2.18 0.00 0.00 223232.43 3462.01 212450.62
00:10:32.456 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p6 : 5.29 556.91 2.18 0.00 0.00 222940.61 3405.02 204244.37
00:10:32.456 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p6 : 5.28 557.40 2.18 0.00 0.00 222723.12 3390.78 205156.17
00:10:32.456 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x200
00:10:32.456 Malloc2p7 : 5.29 556.40 2.17 0.00 0.00 222453.83 3433.52 201508.95
00:10:32.456 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x200 length 0x200
00:10:32.456 Malloc2p7 : 5.29 556.90 2.18 0.00 0.00 222209.24 3419.27 201508.95
00:10:32.456 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x1000
00:10:32.456 TestPT : 5.29 535.11 2.09 0.00 0.00 229230.41 14132.98 201508.95
00:10:32.456 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x1000 length 0x1000
00:10:32.456 TestPT : 5.31 532.18 2.08 0.00 0.00 231031.15 14930.81 273541.57
00:10:32.456 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x2000
00:10:32.456 raid0 : 5.30 555.74 2.17 0.00 0.00 221285.60 3604.48 184184.65
00:10:32.456 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x2000 length 0x2000
00:10:32.456 raid0 : 5.29 556.26 2.17 0.00 0.00 221057.61 3604.48 176890.21
00:10:32.456 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x2000
00:10:32.456 concat0 : 5.30 555.20 2.17 0.00 0.00 220820.50 3547.49 178713.82
00:10:32.456 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x2000 length 0x2000
00:10:32.456 concat0 : 5.30 555.73 2.17 0.00 0.00 220592.80 3561.74 173242.99
00:10:32.456 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x1000
00:10:32.456 raid1 : 5.31 554.71 2.17 0.00 0.00 220331.51 4331.07 181449.24
00:10:32.456 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x1000 length 0x1000
00:10:32.456 raid1 : 5.30 555.18 2.17 0.00 0.00 220117.07 4359.57 177802.02
00:10:32.456 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x0 length 0x4e2
00:10:32.456 AIO0 : 5.31 554.36 2.17 0.00 0.00 219739.81 1780.87 187831.87
00:10:32.456 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:10:32.456 Verification LBA range: start 0x4e2 length 0x4e2
00:10:32.456 AIO0 : 5.31 554.75 2.17 0.00 0.00 219540.93 1795.12 186008.26
00:10:32.456 ===================================================================================================================
00:10:32.456 Total : 18665.13 72.91 0.00 0.00 213967.30 544.95 474138.71
00:10:32.456 
00:10:32.456 real 0m6.504s
00:10:32.456 user 0m12.110s
00:10:32.456 sys 0m0.392s
00:10:32.457 10:18:08 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:32.457 10:18:08 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:10:32.457 ************************************
00:10:32.457 END TEST
bdev_verify 00:10:32.457 ************************************ 00:10:32.457 10:18:08 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:32.457 10:18:08 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:32.457 10:18:08 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:32.457 10:18:08 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:32.457 10:18:08 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:32.457 ************************************ 00:10:32.457 START TEST bdev_verify_big_io 00:10:32.457 ************************************ 00:10:32.457 10:18:08 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:32.457 [2024-07-15 10:18:08.973910] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:10:32.457 [2024-07-15 10:18:08.973983] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid464340 ]
00:10:32.457 [2024-07-15 10:18:09.085174] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:10:32.457 [2024-07-15 10:18:09.185501] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:10:32.457 [2024-07-15 10:18:09.185506] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:32.457 [2024-07-15 10:18:09.342224] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:32.457 [2024-07-15 10:18:09.342291] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:10:32.457 [2024-07-15 10:18:09.342306] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:10:32.457 [2024-07-15 10:18:09.350225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:32.457 [2024-07-15 10:18:09.350252] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:32.457 [2024-07-15 10:18:09.358237] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:32.457 [2024-07-15 10:18:09.358261] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:32.457 [2024-07-15 10:18:09.435547] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:32.457 [2024-07-15 10:18:09.435602] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:10:32.457 [2024-07-15 10:18:09.435621] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbcf4d0
00:10:32.457 [2024-07-15 10:18:09.435634] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:10:32.457 [2024-07-15 10:18:09.437291] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:10:32.457 [2024-07-15 10:18:09.437321] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:10:32.457 [2024-07-15 10:18:09.601123] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.602269] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.604085] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.605317] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.607143] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.608343] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.610167] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.611995] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.613032] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.614504] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.615447] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.616921] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.617863] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.619352] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.620290] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.621773] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:10:32.457 [2024-07-15 10:18:09.645860] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:10:32.457 [2024-07-15 10:18:09.647886] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:10:32.716 Running I/O for 5 seconds...
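The run of bdevperf warnings above all apply one rule: a verify job cannot keep more IOs in flight than the target bdev accepts simultaneously, so the requested depth (-q 128) is clamped to the bdev's limit (32 for the Malloc2p* bdevs, 78 for AIO0). A minimal sketch of that clamping; the function name and limits table are illustrative, not SPDK's actual code:

```python
# Illustrative sketch of the queue-depth clamping described by the
# bdevperf warnings above. Names and the limits table are hypothetical;
# the limit values are the ones reported in this log.

def effective_queue_depth(requested: int, bdev_max_outstanding: int) -> int:
    """A verify job's depth is capped at the bdev's simultaneous IO limit."""
    return min(requested, bdev_max_outstanding)

# Limits seen in the log: Malloc2p* accept 32 IOs at once, AIO0 accepts 78.
limits = {"Malloc2p0": 32, "AIO0": 78, "Malloc0": 128}

depths = {bdev: effective_queue_depth(128, cap) for bdev, cap in limits.items()}
```

With -q 128 this yields depth 32 for Malloc2p0 and 78 for AIO0, matching the "Queue depth is limited to" messages above.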
00:10:40.832 
00:10:40.833 Latency(us)
00:10:40.833 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:40.833 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x100
00:10:40.833 Malloc0 : 5.88 174.17 10.89 0.00 0.00 720172.91 879.75 1940321.50
00:10:40.833 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x100 length 0x100
00:10:40.833 Malloc0 : 5.97 149.96 9.37 0.00 0.00 837419.58 869.06 2290454.71
00:10:40.833 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x80
00:10:40.833 Malloc1p0 : 6.74 35.59 2.22 0.00 0.00 3237508.33 1474.56 5485420.19
00:10:40.833 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x80 length 0x80
00:10:40.833 Malloc1p0 : 6.27 87.39 5.46 0.00 0.00 1352696.51 2507.46 2698943.44
00:10:40.833 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x80
00:10:40.833 Malloc1p1 : 6.75 35.58 2.22 0.00 0.00 3132572.73 1560.04 5281175.82
00:10:40.833 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x80 length 0x80
00:10:40.833 Malloc1p1 : 6.65 36.08 2.25 0.00 0.00 3133920.70 1609.91 5397886.89
00:10:40.833 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p0 : 6.25 25.59 1.60 0.00 0.00 1115325.87 609.06 2115388.10
00:10:40.833 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p0 : 6.20 23.22 1.45 0.00 0.00 1212685.59 616.18 1984088.15
00:10:40.833 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p1 : 6.25 25.59 1.60 0.00 0.00 1105104.26 630.43 2100799.22
00:10:40.833 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p1 : 6.20 23.22 1.45 0.00 0.00 1202167.08 630.43 1954910.39
00:10:40.833 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p2 : 6.25 25.58 1.60 0.00 0.00 1095838.82 641.11 2057032.57
00:10:40.833 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p2 : 6.20 23.21 1.45 0.00 0.00 1190997.52 651.80 1940321.50
00:10:40.833 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p3 : 6.26 25.57 1.60 0.00 0.00 1085718.90 633.99 2027854.80
00:10:40.833 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p3 : 6.27 25.51 1.59 0.00 0.00 1091705.18 658.92 1911143.74
00:10:40.833 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p4 : 6.26 25.57 1.60 0.00 0.00 1074402.92 637.55 1998677.04
00:10:40.833 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p4 : 6.27 25.50 1.59 0.00 0.00 1082261.60 648.24 1881965.97
00:10:40.833 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p5 : 6.26 25.56 1.60 0.00 0.00 1064714.36 651.80 1969499.27
00:10:40.833 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p5 : 6.28 25.50 1.59 0.00 0.00 1072345.17 658.92 1860082.64
00:10:40.833 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p6 : 6.26 25.56 1.60 0.00 0.00 1055195.35 637.55 1954910.39
00:10:40.833 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p6 : 6.28 25.49 1.59 0.00 0.00 1063029.77 655.36 1830904.88
00:10:40.833 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x20
00:10:40.833 Malloc2p7 : 6.26 25.55 1.60 0.00 0.00 1044794.96 633.99 1925732.62
00:10:40.833 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x20 length 0x20
00:10:40.833 Malloc2p7 : 6.28 25.48 1.59 0.00 0.00 1053342.42 662.48 1809021.55
00:10:40.833 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x100
00:10:40.833 TestPT : 6.82 37.55 2.35 0.00 0.00 2694171.92 1467.44 4872687.08
00:10:40.833 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x100 length 0x100
00:10:40.833 TestPT : 6.72 33.65 2.10 0.00 0.00 3045673.52 86621.50 3486743.15
00:10:40.833 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x200
00:10:40.833 raid0 : 6.57 43.97 2.75 0.00 0.00 2281509.75 1588.54 4697620.48
00:10:40.833 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x200 length 0x200
00:10:40.833 raid0 : 6.75 40.29 2.52 0.00 0.00 2472528.10 1602.78 4785153.78
00:10:40.833 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x200
00:10:40.833 concat0 : 6.75 54.53 3.41 0.00 0.00 1788276.37 1759.50 4522553.88
00:10:40.833 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x200 length 0x200
00:10:40.833 concat0 : 6.59 54.06 3.38 0.00 0.00 1827888.79 1567.17 4610087.18
00:10:40.833 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x100
00:10:40.833 raid1 : 6.86 53.17 3.32 0.00 0.00 1778177.05 1994.57 4347487.28
00:10:40.833 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x100 length 0x100
00:10:40.833 raid1 : 6.75 52.12 3.26 0.00 0.00 1830145.82 2037.31 4405842.81
00:10:40.833 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x0 length 0x4e
00:10:40.833 AIO0 : 6.89 70.99 4.44 0.00 0.00 796681.23 780.02 2801065.63
00:10:40.833 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:10:40.833 Verification LBA range: start 0x4e length 0x4e
00:10:40.833 AIO0 : 6.87 68.32 4.27 0.00 0.00 835524.62 680.29 2859421.16
00:10:40.833 ===================================================================================================================
00:10:40.833 Total : 1429.10 89.32 0.00 0.00 1461746.08 609.06 5485420.19
00:10:40.833 
00:10:40.833 real 0m8.173s
00:10:40.833 user 0m15.391s
00:10:40.833 sys 0m0.437s
00:10:40.833 10:18:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:40.833 10:18:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:10:40.833 ************************************
00:10:40.833 END TEST
bdev_verify_big_io 00:10:40.833 ************************************ 00:10:40.833 10:18:17 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:40.833 10:18:17 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:40.833 10:18:17 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:40.833 10:18:17 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:40.833 10:18:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:40.833 ************************************ 00:10:40.833 START TEST bdev_write_zeroes 00:10:40.833 ************************************ 00:10:40.833 10:18:17 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:40.833 [2024-07-15 10:18:17.228181] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:10:40.833 [2024-07-15 10:18:17.228242] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465524 ]
00:10:40.833 [2024-07-15 10:18:17.353340] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:10:40.833 [2024-07-15 10:18:17.449775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:10:40.833 [2024-07-15 10:18:17.603438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:40.833 [2024-07-15 10:18:17.603502] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:10:40.833 [2024-07-15 10:18:17.603518] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:10:40.833 [2024-07-15 10:18:17.611446] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:40.833 [2024-07-15 10:18:17.611473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:10:40.833 [2024-07-15 10:18:17.619456] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:40.833 [2024-07-15 10:18:17.619480] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:10:40.833 [2024-07-15 10:18:17.696623] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:10:40.833 [2024-07-15 10:18:17.696676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:10:40.833 [2024-07-15 10:18:17.696695] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x253dc10
00:10:40.833 [2024-07-15 10:18:17.696707] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:10:40.833 [2024-07-15 10:18:17.698197] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev
registered
00:10:40.833 [2024-07-15 10:18:17.698225] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:10:40.833 Running I/O for 1 seconds...
00:10:41.825 
00:10:41.825 Latency(us)
00:10:41.825 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:41.825 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc0 : 1.05 5009.52 19.57 0.00 0.00 25542.43 662.48 42854.85
00:10:41.825 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc1p0 : 1.05 5002.43 19.54 0.00 0.00 25531.99 911.81 41943.04
00:10:41.825 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc1p1 : 1.05 4995.38 19.51 0.00 0.00 25511.21 911.81 41031.23
00:10:41.825 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p0 : 1.05 4988.31 19.49 0.00 0.00 25490.16 911.81 40119.43
00:10:41.825 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p1 : 1.05 4981.31 19.46 0.00 0.00 25469.89 908.24 39207.62
00:10:41.825 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p2 : 1.06 4974.35 19.43 0.00 0.00 25451.51 926.05 38295.82
00:10:41.825 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p3 : 1.06 4967.33 19.40 0.00 0.00 25429.63 908.24 37384.01
00:10:41.825 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p4 : 1.06 4960.40 19.38 0.00 0.00 25408.29 897.56 36472.21
00:10:41.825 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p5 : 1.06 4953.49 19.35 0.00 0.00 25393.69 904.68 35788.35
00:10:41.825 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p6 : 1.06 4946.54 19.32 0.00 0.00 25371.37 904.68 34876.55
00:10:41.825 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 Malloc2p7 : 1.06 4939.66 19.30 0.00 0.00 25349.66 904.68 33964.74
00:10:41.825 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 TestPT : 1.06 4932.82 19.27 0.00 0.00 25328.59 940.30 33052.94
00:10:41.825 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 raid0 : 1.07 4924.86 19.24 0.00 0.00 25300.16 1609.91 31457.28
00:10:41.825 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 concat0 : 1.07 4917.11 19.21 0.00 0.00 25250.63 1595.66 29861.62
00:10:41.825 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 raid1 : 1.07 4907.42 19.17 0.00 0.00 25193.25 2550.21 27126.21
00:10:41.825 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:10:41.825 AIO0 : 1.07 4901.50 19.15 0.00 0.00 25099.83 1047.15 26100.42
00:10:41.825 ===================================================================================================================
00:10:41.825 Total : 79302.42 309.78 0.00 0.00 25382.64 662.48 42854.85
00:10:42.392 
00:10:42.392 real 0m2.226s
00:10:42.392 user 0m1.850s
00:10:42.392 sys 0m0.319s
00:10:42.392 10:18:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:10:42.392 10:18:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:10:42.392 ************************************
00:10:42.392 END TEST bdev_write_zeroes
00:10:42.392 ************************************
00:10:42.392 10:18:19 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:10:42.392 10:18:19 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:42.392 10:18:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:42.392 10:18:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:42.392 10:18:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:42.392 ************************************ 00:10:42.392 START TEST bdev_json_nonenclosed 00:10:42.392 ************************************ 00:10:42.392 10:18:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:42.392 [2024-07-15 10:18:19.537783] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:10:42.392 [2024-07-15 10:18:19.537844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465735 ] 00:10:42.650 [2024-07-15 10:18:19.657192] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:42.650 [2024-07-15 10:18:19.758783] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:42.650 [2024-07-15 10:18:19.758852] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
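The bdev_json_nonenclosed test above deliberately feeds bdevperf a config whose top level is not a JSON object, and the bdev_json_nonarray test that follows feeds one whose 'subsystems' member is not an array; json_config rejects both with the errors logged here. A rough Python sketch of those two top-level shape checks (the helper function is ours for illustration; SPDK performs these checks in C in json_config.c):

```python
import json
from typing import Optional

def validate_config_shape(text: str) -> Optional[str]:
    """Top-level shape checks mirroring the two errors seen in this log.

    Illustrative only -- SPDK's real validation lives in json_config.c.
    """
    data = json.loads(text)
    if not isinstance(data, dict):
        # Matches: "Invalid JSON configuration: not enclosed in {}."
        return "not enclosed in {}"
    if "subsystems" in data and not isinstance(data["subsystems"], list):
        # Matches: "Invalid JSON configuration: 'subsystems' should be an array."
        return "'subsystems' should be an array"
    return None
```

Either failure makes the app exit non-zero, which the surrounding shell harness records as `es=234` and then treats as the expected outcome of these negative tests.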
00:10:42.650 [2024-07-15 10:18:19.758873] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:42.650 [2024-07-15 10:18:19.758885] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:42.908 00:10:42.908 real 0m0.386s 00:10:42.908 user 0m0.241s 00:10:42.908 sys 0m0.141s 00:10:42.908 10:18:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:10:42.908 10:18:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:42.908 10:18:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:10:42.908 ************************************ 00:10:42.908 END TEST bdev_json_nonenclosed 00:10:42.908 ************************************ 00:10:42.908 10:18:19 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:42.908 10:18:19 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:10:42.908 10:18:19 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:42.908 10:18:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:42.908 10:18:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:42.908 10:18:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:42.908 ************************************ 00:10:42.908 START TEST bdev_json_nonarray 00:10:42.908 ************************************ 00:10:42.908 10:18:19 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:42.908 [2024-07-15 10:18:20.002031] Starting 
SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:10:42.908 [2024-07-15 10:18:20.002114] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465926 ] 00:10:43.166 [2024-07-15 10:18:20.146347] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:43.166 [2024-07-15 10:18:20.243765] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:43.166 [2024-07-15 10:18:20.243835] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:10:43.166 [2024-07-15 10:18:20.243856] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:43.166 [2024-07-15 10:18:20.243868] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:43.166 00:10:43.166 real 0m0.406s 00:10:43.166 user 0m0.236s 00:10:43.166 sys 0m0.167s 00:10:43.166 10:18:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:10:43.166 10:18:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:43.166 10:18:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:10:43.166 ************************************ 00:10:43.166 END TEST bdev_json_nonarray 00:10:43.166 ************************************ 00:10:43.423 10:18:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:43.423 10:18:20 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:10:43.423 10:18:20 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:10:43.423 10:18:20 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:10:43.423 10:18:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:43.423 10:18:20 
blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:43.423 10:18:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:43.423 ************************************ 00:10:43.423 START TEST bdev_qos 00:10:43.423 ************************************ 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=465950 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 465950' 00:10:43.423 Process qos testing pid: 465950 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 465950 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 465950 ']' 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:43.423 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
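The `waitforlisten` step above blocks until the freshly started bdevperf process answers on `/var/tmp/spdk.sock`, retrying up to `max_retries` times. A minimal sketch of that polling pattern, under assumptions: the `wait_for` helper and its probe argument are illustrative names, not SPDK's code (the real helper probes the RPC socket with `rpc.py`).

```shell
#!/bin/sh
# Generic retry loop in the spirit of waitforlisten: run a probe command
# until it succeeds or max_retries is exhausted. The real suite's probe
# would be an rpc.py call against /var/tmp/spdk.sock (assumption here);
# any command works for illustration.
wait_for() {
  probe="$1"; max_retries="${2:-100}"; i=0
  until eval "$probe" >/dev/null 2>&1; do
    i=$((i + 1))
    [ "$i" -ge "$max_retries" ] && return 1
    sleep 0.1
  done
  return 0
}

wait_for true 5 && echo "listening"
wait_for false 3 || echo "gave up after retries"
```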
00:10:43.423 10:18:20 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:43.423 10:18:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:43.423 [2024-07-15 10:18:20.498837] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:10:43.423 [2024-07-15 10:18:20.498910] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid465950 ] 00:10:43.423 [2024-07-15 10:18:20.620609] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:43.688 [2024-07-15 10:18:20.721144] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:44.253 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:44.253 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:10:44.253 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:10:44.253 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.253 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:44.512 Malloc_0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:44.512 10:18:21 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:44.512 [ 00:10:44.512 { 00:10:44.512 "name": "Malloc_0", 00:10:44.512 "aliases": [ 00:10:44.512 "02b7f1ef-5998-405e-bd44-66a286cf3a65" 00:10:44.512 ], 00:10:44.512 "product_name": "Malloc disk", 00:10:44.512 "block_size": 512, 00:10:44.512 "num_blocks": 262144, 00:10:44.512 "uuid": "02b7f1ef-5998-405e-bd44-66a286cf3a65", 00:10:44.512 "assigned_rate_limits": { 00:10:44.512 "rw_ios_per_sec": 0, 00:10:44.512 "rw_mbytes_per_sec": 0, 00:10:44.512 "r_mbytes_per_sec": 0, 00:10:44.512 "w_mbytes_per_sec": 0 00:10:44.512 }, 00:10:44.512 "claimed": false, 00:10:44.512 "zoned": false, 00:10:44.512 "supported_io_types": { 00:10:44.512 "read": true, 00:10:44.512 "write": true, 00:10:44.512 "unmap": true, 00:10:44.512 "flush": true, 00:10:44.512 "reset": true, 00:10:44.512 "nvme_admin": false, 00:10:44.512 "nvme_io": false, 00:10:44.512 "nvme_io_md": false, 00:10:44.512 "write_zeroes": true, 00:10:44.512 "zcopy": true, 00:10:44.512 "get_zone_info": false, 00:10:44.512 "zone_management": false, 00:10:44.512 "zone_append": false, 00:10:44.512 "compare": false, 00:10:44.512 "compare_and_write": false, 00:10:44.512 "abort": true, 00:10:44.512 "seek_hole": false, 00:10:44.512 
"seek_data": false, 00:10:44.512 "copy": true, 00:10:44.512 "nvme_iov_md": false 00:10:44.512 }, 00:10:44.512 "memory_domains": [ 00:10:44.512 { 00:10:44.512 "dma_device_id": "system", 00:10:44.512 "dma_device_type": 1 00:10:44.512 }, 00:10:44.512 { 00:10:44.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.512 "dma_device_type": 2 00:10:44.512 } 00:10:44.512 ], 00:10:44.512 "driver_specific": {} 00:10:44.512 } 00:10:44.512 ] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:44.512 Null_1 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:44.512 10:18:21 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:44.512 [ 00:10:44.512 { 00:10:44.512 "name": "Null_1", 00:10:44.512 "aliases": [ 00:10:44.512 "68e9981c-98f0-4d26-ae47-4c79fa9813df" 00:10:44.512 ], 00:10:44.512 "product_name": "Null disk", 00:10:44.512 "block_size": 512, 00:10:44.512 "num_blocks": 262144, 00:10:44.512 "uuid": "68e9981c-98f0-4d26-ae47-4c79fa9813df", 00:10:44.512 "assigned_rate_limits": { 00:10:44.512 "rw_ios_per_sec": 0, 00:10:44.512 "rw_mbytes_per_sec": 0, 00:10:44.512 "r_mbytes_per_sec": 0, 00:10:44.512 "w_mbytes_per_sec": 0 00:10:44.512 }, 00:10:44.512 "claimed": false, 00:10:44.512 "zoned": false, 00:10:44.512 "supported_io_types": { 00:10:44.512 "read": true, 00:10:44.512 "write": true, 00:10:44.512 "unmap": false, 00:10:44.512 "flush": false, 00:10:44.512 "reset": true, 00:10:44.512 "nvme_admin": false, 00:10:44.512 "nvme_io": false, 00:10:44.512 "nvme_io_md": false, 00:10:44.512 "write_zeroes": true, 00:10:44.512 "zcopy": false, 00:10:44.512 "get_zone_info": false, 00:10:44.512 "zone_management": false, 00:10:44.512 "zone_append": false, 00:10:44.512 "compare": false, 00:10:44.512 "compare_and_write": false, 00:10:44.512 "abort": true, 00:10:44.512 "seek_hole": false, 00:10:44.512 "seek_data": false, 00:10:44.512 "copy": false, 00:10:44.512 "nvme_iov_md": false 00:10:44.512 }, 00:10:44.512 "driver_specific": {} 00:10:44.512 } 00:10:44.512 ] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@457 -- # qos_function_test 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:44.512 10:18:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:10:44.771 Running I/O for 60 seconds... 
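The trace above measures ~63067 unlimited IOPS, caps Malloc_0 at 15000 IOPS via `bdev_set_qos_limit`, then `run_qos_test 15000 IOPS Malloc_0` accepts the remeasured rate only inside a ±10% window computed with shell integer arithmetic (the `lower_limit=13500` / `upper_limit=16500` lines that follow). A standalone sketch of that acceptance check using this run's numbers; it mirrors the logged values but is not the `blockdev.sh` source:

```shell
#!/bin/sh
# +/-10% acceptance window used by the QoS tests, shown with this run's
# values: a 15000 IOPS cap and a measured result of 14999 IOPS.
qos_limit=15000
qos_result=14999                        # IOPS reported by iostat while capped
lower_limit=$((qos_limit * 90 / 100))   # 13500
upper_limit=$((qos_limit * 110 / 100))  # 16500
if [ "$qos_result" -lt "$lower_limit" ] || [ "$qos_result" -gt "$upper_limit" ]; then
  echo "FAIL: $qos_result outside [$lower_limit, $upper_limit]"
else
  echo "PASS: $qos_result within [$lower_limit, $upper_limit]"
fi
```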
00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 63067.47 252269.88 0.00 0.00 253952.00 0.00 0.00 ' 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=63067.47 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 63067 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=63067 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=15000 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 15000 -gt 1000 ']' 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 15000 Malloc_0 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 15000 IOPS Malloc_0 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:50.038 10:18:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:50.038 ************************************ 00:10:50.038 START TEST bdev_qos_iops 00:10:50.038 ************************************ 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 15000 IOPS Malloc_0 00:10:50.038 10:18:26 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=15000 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:10:50.038 10:18:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 14999.28 59997.12 0.00 0.00 60720.00 0.00 0.00 ' 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=14999.28 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 14999 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=14999 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:10:55.334 10:18:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=13500 00:10:55.334 10:18:32 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=16500 00:10:55.334 10:18:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14999 -lt 13500 ']' 00:10:55.334 10:18:32 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14999 -gt 16500 ']' 00:10:55.334 00:10:55.334 real 0m5.242s 00:10:55.334 user 0m0.106s 00:10:55.334 sys 0m0.056s 00:10:55.334 10:18:32 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:55.334 10:18:32 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:55.334 ************************************ 00:10:55.334 END TEST bdev_qos_iops 00:10:55.334 ************************************ 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:10:55.334 10:18:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20170.76 80683.06 0.00 0.00 81920.00 0.00 0.00 ' 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:00.601 10:18:37 
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=81920.00 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 81920 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=81920 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:00.601 10:18:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.601 ************************************ 00:11:00.601 START TEST bdev_qos_bw 00:11:00.601 ************************************ 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:11:00.601 
10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:11:00.601 10:18:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2048.02 8192.06 0.00 0.00 8396.00 0.00 0.00 ' 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8396.00 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8396 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8396 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@400 -- # '[' 8396 -lt 7372 ']' 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8396 -gt 9011 ']' 00:11:05.862 00:11:05.862 real 0m5.293s 00:11:05.862 user 0m0.113s 00:11:05.862 sys 0m0.048s 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:11:05.862 ************************************ 00:11:05.862 END TEST bdev_qos_bw 00:11:05.862 ************************************ 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:05.862 10:18:42 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:05.862 ************************************ 00:11:05.862 START TEST bdev_qos_ro_bw 00:11:05.862 ************************************ 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:11:05.862 10:18:42 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:05.862 10:18:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.61 2046.46 0.00 0.00 2052.00 0.00 0.00 ' 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2052.00 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2052 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2052 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 
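Each `get_io_result` call above reduces `iostat.py` output to one number with a `grep | tail -1 | awk` pipeline: field 2 for an IOPS limit, field 6 for a bandwidth (KB/s) limit. A sketch of that extraction against a sample device row taken from this run's log (the variable names are illustrative, not the script's):

```shell
#!/bin/sh
# Reduce an iostat-style device row to one number, as the grep/tail/awk
# pipeline in get_io_result does. Sample row copied from this run's output.
iostat_output='Malloc_0 63067.47 252269.88 0.00 0.00 253952.00 0.00 0.00 '
row=$(printf '%s\n' "$iostat_output" | grep Malloc_0 | tail -1)
iops=$(printf '%s\n' "$row" | awk '{print $2}')  # IOPS column
bw=$(printf '%s\n' "$row" | awk '{print $6}')    # KB/s column
echo "IOPS=$iops KB/s=$bw"
```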
00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843
00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252
00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -lt 1843 ']'
00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -gt 2252 ']'
00:11:11.155
00:11:11.155 real 0m5.137s
00:11:11.155 user 0m0.080s
00:11:11.155 sys 0m0.041s
00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:11.155 10:18:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x
00:11:11.155 ************************************
00:11:11.155 END TEST bdev_qos_ro_bw
00:11:11.155 ************************************
00:11:11.155 10:18:47 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0
00:11:11.155 10:18:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0
00:11:11.155 10:18:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:11:11.155 10:18:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:11:11.413 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:11:11.413 10:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1
00:11:11.413 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable
00:11:11.413 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x
00:11:11.672
00:11:11.672 Latency(us)
00:11:11.672 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:11:11.672 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:11:11.672 Malloc_0 : 26.67 20822.55 81.34 0.00 0.00 12179.68 2023.07 503316.48
00:11:11.672 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:11:11.672 Null_1 : 26.82 20478.49 79.99 0.00 0.00 12468.12 829.89 151359.67
00:11:11.672 ===================================================================================================================
00:11:11.672 Total : 41301.04 161.33 0.00 0.00 12323.10 829.89 503316.48
00:11:11.672 0
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 465950
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 465950 ']'
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 465950
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 465950
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 465950'
00:11:11.672 killing process with pid 465950
00:11:11.672 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 465950
00:11:11.672 Received shutdown signal, test time was about 26.876159 seconds
00:11:11.672
00:11:11.672 Latency(us)
00:11:11.672 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:11:11.672 ===================================================================================================================
00:11:11.672 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:11:11.672 10:18:48
blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 465950 00:11:11.930 10:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:11:11.930 00:11:11.930 real 0m28.441s 00:11:11.930 user 0m29.272s 00:11:11.930 sys 0m0.844s 00:11:11.930 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:11.930 10:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:11.930 ************************************ 00:11:11.930 END TEST bdev_qos 00:11:11.930 ************************************ 00:11:11.930 10:18:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:11.930 10:18:48 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:11:11.930 10:18:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:11.930 10:18:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:11.930 10:18:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:11.930 ************************************ 00:11:11.930 START TEST bdev_qd_sampling 00:11:11.930 ************************************ 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=469736 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 469736' 00:11:11.930 Process bdev QD sampling period testing pid: 469736 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:11:11.930 10:18:48 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 469736 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 469736 ']' 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:11.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:11.930 10:18:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:11.930 [2024-07-15 10:18:49.015631] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:11:11.930 [2024-07-15 10:18:49.015694] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469736 ] 00:11:12.188 [2024-07-15 10:18:49.146271] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:12.188 [2024-07-15 10:18:49.252050] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:12.188 [2024-07-15 10:18:49.252055] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:12.755 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:12.755 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:11:12.755 10:18:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:11:12.755 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:12.755 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:13.013 Malloc_QD 00:11:13.013 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:13.013 10:18:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:11:13.013 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:11:13.013 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:13.014 10:18:49 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:13.014 10:18:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:13.014 [ 00:11:13.014 { 00:11:13.014 "name": "Malloc_QD", 00:11:13.014 "aliases": [ 00:11:13.014 "0c5858c3-b503-45a5-a6ed-a0dbd9ca3413" 00:11:13.014 ], 00:11:13.014 "product_name": "Malloc disk", 00:11:13.014 "block_size": 512, 00:11:13.014 "num_blocks": 262144, 00:11:13.014 "uuid": "0c5858c3-b503-45a5-a6ed-a0dbd9ca3413", 00:11:13.014 "assigned_rate_limits": { 00:11:13.014 "rw_ios_per_sec": 0, 00:11:13.014 "rw_mbytes_per_sec": 0, 00:11:13.014 "r_mbytes_per_sec": 0, 00:11:13.014 "w_mbytes_per_sec": 0 00:11:13.014 }, 00:11:13.014 "claimed": false, 00:11:13.014 "zoned": false, 00:11:13.014 "supported_io_types": { 00:11:13.014 "read": true, 00:11:13.014 "write": true, 00:11:13.014 "unmap": true, 00:11:13.014 "flush": true, 00:11:13.014 "reset": true, 00:11:13.014 "nvme_admin": false, 00:11:13.014 "nvme_io": false, 00:11:13.014 "nvme_io_md": false, 00:11:13.014 "write_zeroes": true, 00:11:13.014 "zcopy": true, 00:11:13.014 "get_zone_info": false, 00:11:13.014 "zone_management": false, 00:11:13.014 "zone_append": false, 00:11:13.014 "compare": false, 00:11:13.014 "compare_and_write": false, 00:11:13.014 "abort": true, 00:11:13.014 "seek_hole": false, 00:11:13.014 "seek_data": false, 00:11:13.014 "copy": true, 
00:11:13.014 "nvme_iov_md": false 00:11:13.014 }, 00:11:13.014 "memory_domains": [ 00:11:13.014 { 00:11:13.014 "dma_device_id": "system", 00:11:13.014 "dma_device_type": 1 00:11:13.014 }, 00:11:13.014 { 00:11:13.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.014 "dma_device_type": 2 00:11:13.014 } 00:11:13.014 ], 00:11:13.014 "driver_specific": {} 00:11:13.014 } 00:11:13.014 ] 00:11:13.014 10:18:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:13.014 10:18:50 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:11:13.014 10:18:50 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:11:13.014 10:18:50 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:13.014 Running I/O for 5 seconds... 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:11:14.915 "tick_rate": 2300000000, 00:11:14.915 "ticks": 5333474145395270, 00:11:14.915 "bdevs": [ 00:11:14.915 { 00:11:14.915 "name": "Malloc_QD", 00:11:14.915 "bytes_read": 777040384, 00:11:14.915 "num_read_ops": 189700, 00:11:14.915 "bytes_written": 0, 00:11:14.915 "num_write_ops": 0, 00:11:14.915 "bytes_unmapped": 0, 00:11:14.915 "num_unmap_ops": 0, 00:11:14.915 "bytes_copied": 0, 00:11:14.915 "num_copy_ops": 0, 00:11:14.915 "read_latency_ticks": 2241639838644, 00:11:14.915 "max_read_latency_ticks": 14814850, 00:11:14.915 "min_read_latency_ticks": 306898, 00:11:14.915 "write_latency_ticks": 0, 00:11:14.915 "max_write_latency_ticks": 0, 00:11:14.915 "min_write_latency_ticks": 0, 00:11:14.915 "unmap_latency_ticks": 0, 00:11:14.915 "max_unmap_latency_ticks": 0, 00:11:14.915 "min_unmap_latency_ticks": 0, 00:11:14.915 "copy_latency_ticks": 0, 00:11:14.915 "max_copy_latency_ticks": 0, 00:11:14.915 "min_copy_latency_ticks": 0, 00:11:14.915 "io_error": {}, 00:11:14.915 "queue_depth_polling_period": 10, 00:11:14.915 "queue_depth": 512, 00:11:14.915 "io_time": 30, 00:11:14.915 "weighted_io_time": 15360 00:11:14.915 } 00:11:14.915 ] 00:11:14.915 }' 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:11:14.915 10:18:52 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD
00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable
00:11:14.915 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x
00:11:15.174
00:11:15.174 Latency(us)
00:11:15.174 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:11:15.174 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096)
00:11:15.174 Malloc_QD : 1.98 49272.49 192.47 0.00 0.00 5182.57 1467.44 5527.82
00:11:15.174 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:11:15.174 Malloc_QD : 1.99 50261.43 196.33 0.00 0.00 5081.30 947.42 6468.12
00:11:15.174 ===================================================================================================================
00:11:15.174 Total : 99533.92 388.80 0.00 0.00 5131.41 947.42 6468.12
00:11:15.174 0
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 469736
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 469736 ']'
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 469736
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 469736
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
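A quick sanity check on the bdevperf result table above (not part of the test run itself): the Total row can be reproduced from the two per-core job rows by summing IOPS and MiB/s and taking the IOPS-weighted mean of the per-job average latencies. The per-job figures below are copied from the log; the aggregation rule is an assumption about how bdevperf combines jobs, not taken from its source.

```python
# Recompute the "Total" row of the bdev_qd_sampling latency table.
# Per-job numbers (IOPS, MiB/s, avg latency in us) are copied from the log;
# summing rates and IOPS-weighting the latency is an assumed aggregation rule.
jobs = [
    # (name, iops, mib_s, avg_latency_us)
    ("Malloc_QD core 0", 49272.49, 192.47, 5182.57),
    ("Malloc_QD core 1", 50261.43, 196.33, 5081.30),
]

total_iops = sum(iops for _, iops, _, _ in jobs)
total_mib_s = sum(mib for _, _, mib, _ in jobs)
weighted_avg_us = sum(iops * lat for _, iops, _, lat in jobs) / total_iops

print(f"Total IOPS  : {total_iops:.2f}")       # 99533.92, as in the log
print(f"Total MiB/s : {total_mib_s:.2f}")      # 388.80, as in the log
print(f"Avg latency : {weighted_avg_us:.2f} us")
```

The weighted average comes out within a few hundredths of the 5131.41 us the log reports; the small residual is consistent with the per-job values being rounded to two decimals before printing.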
00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 469736' 00:11:15.174 killing process with pid 469736 00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 469736 00:11:15.174 Received shutdown signal, test time was about 2.067804 seconds 00:11:15.174 00:11:15.174 Latency(us) 00:11:15.174 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:15.174 =================================================================================================================== 00:11:15.174 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:15.174 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 469736 00:11:15.433 10:18:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:11:15.433 00:11:15.433 real 0m3.467s 00:11:15.433 user 0m6.773s 00:11:15.433 sys 0m0.434s 00:11:15.433 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:15.433 10:18:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:15.433 ************************************ 00:11:15.433 END TEST bdev_qd_sampling 00:11:15.433 ************************************ 00:11:15.433 10:18:52 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:15.433 10:18:52 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:11:15.433 10:18:52 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:15.433 10:18:52 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:15.433 10:18:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:15.433 ************************************ 00:11:15.433 START TEST bdev_error 00:11:15.433 ************************************ 00:11:15.433 10:18:52 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:11:15.433 10:18:52 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:11:15.433 10:18:52 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:11:15.433 10:18:52 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:11:15.433 10:18:52 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=470281 00:11:15.433 10:18:52 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 470281' 00:11:15.433 Process error testing pid: 470281 00:11:15.433 10:18:52 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:11:15.433 10:18:52 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 470281 00:11:15.433 10:18:52 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 470281 ']' 00:11:15.433 10:18:52 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:15.433 10:18:52 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:15.433 10:18:52 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:15.433 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:15.433 10:18:52 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:15.433 10:18:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:15.433 [2024-07-15 10:18:52.574677] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:11:15.433 [2024-07-15 10:18:52.574748] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470281 ] 00:11:15.691 [2024-07-15 10:18:52.693953] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:15.691 [2024-07-15 10:18:52.800155] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:16.623 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.623 Dev_1 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.623 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.623 
10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.623 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.623 [ 00:11:16.623 { 00:11:16.623 "name": "Dev_1", 00:11:16.623 "aliases": [ 00:11:16.623 "dde7e456-61ab-4e14-aa22-208a35106d2a" 00:11:16.623 ], 00:11:16.623 "product_name": "Malloc disk", 00:11:16.623 "block_size": 512, 00:11:16.623 "num_blocks": 262144, 00:11:16.623 "uuid": "dde7e456-61ab-4e14-aa22-208a35106d2a", 00:11:16.623 "assigned_rate_limits": { 00:11:16.624 "rw_ios_per_sec": 0, 00:11:16.624 "rw_mbytes_per_sec": 0, 00:11:16.624 "r_mbytes_per_sec": 0, 00:11:16.624 "w_mbytes_per_sec": 0 00:11:16.624 }, 00:11:16.624 "claimed": false, 00:11:16.624 "zoned": false, 00:11:16.624 "supported_io_types": { 00:11:16.624 "read": true, 00:11:16.624 "write": true, 00:11:16.624 "unmap": true, 00:11:16.624 "flush": true, 00:11:16.624 "reset": true, 00:11:16.624 "nvme_admin": false, 00:11:16.624 "nvme_io": false, 00:11:16.624 "nvme_io_md": false, 00:11:16.624 "write_zeroes": true, 00:11:16.624 "zcopy": true, 00:11:16.624 "get_zone_info": false, 00:11:16.624 "zone_management": false, 00:11:16.624 "zone_append": false, 00:11:16.624 "compare": false, 00:11:16.624 "compare_and_write": false, 00:11:16.624 "abort": true, 00:11:16.624 "seek_hole": false, 00:11:16.624 "seek_data": false, 00:11:16.624 "copy": true, 00:11:16.624 "nvme_iov_md": false 00:11:16.624 }, 00:11:16.624 "memory_domains": [ 00:11:16.624 { 00:11:16.624 "dma_device_id": "system", 00:11:16.624 "dma_device_type": 1 00:11:16.624 }, 00:11:16.624 { 00:11:16.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:11:16.624 "dma_device_type": 2 00:11:16.624 } 00:11:16.624 ], 00:11:16.624 "driver_specific": {} 00:11:16.624 } 00:11:16.624 ] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:16.624 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.624 true 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.624 Dev_2 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:16.624 10:18:53 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.624 [ 00:11:16.624 { 00:11:16.624 "name": "Dev_2", 00:11:16.624 "aliases": [ 00:11:16.624 "93dfadab-5a42-4134-b073-63938f8e9415" 00:11:16.624 ], 00:11:16.624 "product_name": "Malloc disk", 00:11:16.624 "block_size": 512, 00:11:16.624 "num_blocks": 262144, 00:11:16.624 "uuid": "93dfadab-5a42-4134-b073-63938f8e9415", 00:11:16.624 "assigned_rate_limits": { 00:11:16.624 "rw_ios_per_sec": 0, 00:11:16.624 "rw_mbytes_per_sec": 0, 00:11:16.624 "r_mbytes_per_sec": 0, 00:11:16.624 "w_mbytes_per_sec": 0 00:11:16.624 }, 00:11:16.624 "claimed": false, 00:11:16.624 "zoned": false, 00:11:16.624 "supported_io_types": { 00:11:16.624 "read": true, 00:11:16.624 "write": true, 00:11:16.624 "unmap": true, 00:11:16.624 "flush": true, 00:11:16.624 "reset": true, 00:11:16.624 "nvme_admin": false, 00:11:16.624 "nvme_io": false, 00:11:16.624 "nvme_io_md": false, 00:11:16.624 "write_zeroes": true, 00:11:16.624 "zcopy": true, 00:11:16.624 "get_zone_info": false, 00:11:16.624 "zone_management": false, 00:11:16.624 "zone_append": false, 00:11:16.624 "compare": false, 00:11:16.624 "compare_and_write": false, 00:11:16.624 "abort": true, 00:11:16.624 "seek_hole": false, 00:11:16.624 "seek_data": false, 00:11:16.624 "copy": true, 00:11:16.624 "nvme_iov_md": false 00:11:16.624 }, 00:11:16.624 "memory_domains": [ 00:11:16.624 { 00:11:16.624 "dma_device_id": "system", 00:11:16.624 "dma_device_type": 1 00:11:16.624 }, 00:11:16.624 { 
00:11:16.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:16.624 "dma_device_type": 2 00:11:16.624 } 00:11:16.624 ], 00:11:16.624 "driver_specific": {} 00:11:16.624 } 00:11:16.624 ] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:16.624 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:16.624 10:18:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.624 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:11:16.624 10:18:53 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:16.624 Running I/O for 5 seconds... 00:11:17.556 10:18:54 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 470281 00:11:17.556 10:18:54 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 470281' 00:11:17.556 Process is existed as continue on error is set. 
Pid: 470281
00:11:17.556 10:18:54 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1
00:11:17.556 10:18:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:11:17.556 10:18:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:11:17.556 10:18:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:11:17.556 10:18:54 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1
00:11:17.556 10:18:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable
00:11:17.556 10:18:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x
00:11:17.556 10:18:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:11:17.556 10:18:54 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5
00:11:17.814 Timeout while waiting for response:
00:11:17.814
00:11:17.814
00:11:22.007
00:11:22.007 Latency(us)
00:11:22.007 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:11:22.007 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096)
00:11:22.007 EE_Dev_1 : 0.90 37657.27 147.10 5.58 0.00 421.30 130.00 662.48
00:11:22.007 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096)
00:11:22.007 Dev_2 : 5.00 81721.39 319.22 0.00 0.00 192.28 65.00 22339.23
00:11:22.007 ===================================================================================================================
00:11:22.007 Total : 119378.67 466.32 5.58 0.00 209.76 65.00 22339.23
00:11:22.575 10:18:59 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 470281
00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 470281 ']'
00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 470281
00:11:22.575 10:18:59 blockdev_general.bdev_error
-- common/autotest_common.sh@953 -- # uname 00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 470281 00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 470281' 00:11:22.575 killing process with pid 470281 00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 470281 00:11:22.575 Received shutdown signal, test time was about 5.000000 seconds 00:11:22.575 00:11:22.575 Latency(us) 00:11:22.575 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:22.575 =================================================================================================================== 00:11:22.575 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:22.575 10:18:59 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 470281 00:11:23.142 10:19:00 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=471182 00:11:23.142 10:19:00 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 471182' 00:11:23.142 Process error testing pid: 471182 00:11:23.142 10:19:00 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:11:23.142 10:19:00 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 471182 00:11:23.142 10:19:00 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 471182 ']' 00:11:23.142 10:19:00 blockdev_general.bdev_error -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk.sock 00:11:23.142 10:19:00 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:23.142 10:19:00 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:23.142 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:23.142 10:19:00 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:23.142 10:19:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:23.142 [2024-07-15 10:19:00.098085] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:11:23.142 [2024-07-15 10:19:00.098156] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471182 ] 00:11:23.142 [2024-07-15 10:19:00.215096] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:23.142 [2024-07-15 10:19:00.312373] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:24.077 10:19:01 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.077 Dev_1 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.077 10:19:01 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 
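The malloc bdevs in this run are created with `rpc_cmd bdev_malloc_create -b Dev_1 128 512` (128 MiB with a 512-byte block size), and the `bdev_get_bdevs` descriptor that follows reports `"block_size": 512` and `"num_blocks": 262144`. A small standalone check (not part of the test) that the two views describe the same capacity:

```python
# bdev_malloc_create takes the size in MiB; bdev_get_bdevs reports it in
# blocks. Verify the descriptor's block count matches the create call.
size_mib = 128      # from: rpc_cmd bdev_malloc_create -b Dev_1 128 512
block_size = 512    # bytes, from the same call

num_blocks = size_mib * 1024 * 1024 // block_size
print(num_blocks)   # 262144, matching "num_blocks": 262144 in the descriptor
```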
00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.077 [ 00:11:24.077 { 00:11:24.077 "name": "Dev_1", 00:11:24.077 "aliases": [ 00:11:24.077 "88baba76-377a-4bb1-a007-c1b15a372133" 00:11:24.077 ], 00:11:24.077 "product_name": "Malloc disk", 00:11:24.077 "block_size": 512, 00:11:24.077 "num_blocks": 262144, 00:11:24.077 "uuid": "88baba76-377a-4bb1-a007-c1b15a372133", 00:11:24.077 "assigned_rate_limits": { 00:11:24.077 "rw_ios_per_sec": 0, 00:11:24.077 "rw_mbytes_per_sec": 0, 00:11:24.077 "r_mbytes_per_sec": 0, 00:11:24.077 "w_mbytes_per_sec": 0 00:11:24.077 }, 00:11:24.077 "claimed": false, 00:11:24.077 "zoned": false, 00:11:24.077 "supported_io_types": { 00:11:24.077 "read": true, 00:11:24.077 "write": true, 00:11:24.077 "unmap": true, 00:11:24.077 "flush": true, 
00:11:24.077 "reset": true, 00:11:24.077 "nvme_admin": false, 00:11:24.077 "nvme_io": false, 00:11:24.077 "nvme_io_md": false, 00:11:24.077 "write_zeroes": true, 00:11:24.077 "zcopy": true, 00:11:24.077 "get_zone_info": false, 00:11:24.077 "zone_management": false, 00:11:24.077 "zone_append": false, 00:11:24.077 "compare": false, 00:11:24.077 "compare_and_write": false, 00:11:24.077 "abort": true, 00:11:24.077 "seek_hole": false, 00:11:24.077 "seek_data": false, 00:11:24.077 "copy": true, 00:11:24.077 "nvme_iov_md": false 00:11:24.077 }, 00:11:24.077 "memory_domains": [ 00:11:24.077 { 00:11:24.077 "dma_device_id": "system", 00:11:24.077 "dma_device_type": 1 00:11:24.077 }, 00:11:24.077 { 00:11:24.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.077 "dma_device_type": 2 00:11:24.077 } 00:11:24.077 ], 00:11:24.077 "driver_specific": {} 00:11:24.077 } 00:11:24.077 ] 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.077 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:24.078 10:19:01 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.078 true 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.078 10:19:01 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.078 Dev_2 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.078 10:19:01 blockdev_general.bdev_error -- 
bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.078 [ 00:11:24.078 { 00:11:24.078 "name": "Dev_2", 00:11:24.078 "aliases": [ 00:11:24.078 "67a2ad07-ddd4-459d-a191-444ecf741f5c" 00:11:24.078 ], 00:11:24.078 "product_name": "Malloc disk", 00:11:24.078 "block_size": 512, 00:11:24.078 "num_blocks": 262144, 00:11:24.078 "uuid": "67a2ad07-ddd4-459d-a191-444ecf741f5c", 00:11:24.078 "assigned_rate_limits": { 00:11:24.078 "rw_ios_per_sec": 0, 00:11:24.078 "rw_mbytes_per_sec": 0, 00:11:24.078 "r_mbytes_per_sec": 0, 00:11:24.078 "w_mbytes_per_sec": 0 00:11:24.078 }, 00:11:24.078 "claimed": false, 00:11:24.078 "zoned": false, 00:11:24.078 "supported_io_types": { 00:11:24.078 "read": true, 00:11:24.078 "write": true, 00:11:24.078 
"unmap": true, 00:11:24.078 "flush": true, 00:11:24.078 "reset": true, 00:11:24.078 "nvme_admin": false, 00:11:24.078 "nvme_io": false, 00:11:24.078 "nvme_io_md": false, 00:11:24.078 "write_zeroes": true, 00:11:24.078 "zcopy": true, 00:11:24.078 "get_zone_info": false, 00:11:24.078 "zone_management": false, 00:11:24.078 "zone_append": false, 00:11:24.078 "compare": false, 00:11:24.078 "compare_and_write": false, 00:11:24.078 "abort": true, 00:11:24.078 "seek_hole": false, 00:11:24.078 "seek_data": false, 00:11:24.078 "copy": true, 00:11:24.078 "nvme_iov_md": false 00:11:24.078 }, 00:11:24.078 "memory_domains": [ 00:11:24.078 { 00:11:24.078 "dma_device_id": "system", 00:11:24.078 "dma_device_type": 1 00:11:24.078 }, 00:11:24.078 { 00:11:24.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.078 "dma_device_type": 2 00:11:24.078 } 00:11:24.078 ], 00:11:24.078 "driver_specific": {} 00:11:24.078 } 00:11:24.078 ] 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:24.078 10:19:01 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.078 10:19:01 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 471182 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 471182 00:11:24.078 10:19:01 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:24.078 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 471182 00:11:24.337 Running I/O for 5 seconds... 00:11:24.337 task offset: 254448 on job bdev=EE_Dev_1 fails 00:11:24.337 00:11:24.337 Latency(us) 00:11:24.337 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:24.337 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:24.337 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:11:24.337 EE_Dev_1 : 0.00 29972.75 117.08 6811.99 0.00 361.60 138.02 644.67 00:11:24.337 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:24.337 Dev_2 : 0.00 18317.12 71.55 0.00 0.00 650.96 126.44 1210.99 00:11:24.337 =================================================================================================================== 00:11:24.337 Total : 48289.87 188.63 6811.99 0.00 518.54 126.44 1210.99 00:11:24.337 [2024-07-15 10:19:01.321516] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:24.337 request: 00:11:24.337 { 00:11:24.337 "method": "perform_tests", 00:11:24.337 "req_id": 1 00:11:24.337 } 00:11:24.337 Got JSON-RPC error response 00:11:24.337 response: 00:11:24.337 { 00:11:24.337 "code": -32603, 00:11:24.337 "message": "bdevperf failed with error Operation not permitted" 00:11:24.337 } 00:11:24.596 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:11:24.596 10:19:01 
blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:24.596 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:11:24.596 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:11:24.596 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:11:24.596 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:24.596 00:11:24.596 real 0m9.074s 00:11:24.596 user 0m9.506s 00:11:24.596 sys 0m0.865s 00:11:24.596 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:24.596 10:19:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:24.596 ************************************ 00:11:24.596 END TEST bdev_error 00:11:24.596 ************************************ 00:11:24.596 10:19:01 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:24.596 10:19:01 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:11:24.596 10:19:01 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:24.596 10:19:01 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:24.596 10:19:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:24.596 ************************************ 00:11:24.596 START TEST bdev_stat 00:11:24.596 ************************************ 00:11:24.596 10:19:01 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:11:24.596 10:19:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:11:24.596 10:19:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=471388 00:11:24.596 10:19:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 471388' 00:11:24.597 Process Bdev IO statistics testing pid: 471388 00:11:24.597 
10:19:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 471388 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 471388 ']' 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:24.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:24.597 10:19:01 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:24.597 [2024-07-15 10:19:01.775683] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:11:24.597 [2024-07-15 10:19:01.775829] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471388 ] 00:11:24.855 [2024-07-15 10:19:01.971959] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:25.114 [2024-07-15 10:19:02.075607] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:25.114 [2024-07-15 10:19:02.075613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:25.681 Malloc_STAT 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:25.681 [ 00:11:25.681 { 00:11:25.681 "name": "Malloc_STAT", 00:11:25.681 "aliases": [ 00:11:25.681 "ce3a4c72-f02c-46db-b107-9fdad44a522a" 00:11:25.681 ], 00:11:25.681 "product_name": "Malloc disk", 00:11:25.681 "block_size": 512, 00:11:25.681 "num_blocks": 262144, 00:11:25.681 "uuid": "ce3a4c72-f02c-46db-b107-9fdad44a522a", 00:11:25.681 "assigned_rate_limits": { 00:11:25.681 "rw_ios_per_sec": 0, 00:11:25.681 "rw_mbytes_per_sec": 0, 00:11:25.681 "r_mbytes_per_sec": 0, 00:11:25.681 "w_mbytes_per_sec": 0 00:11:25.681 }, 00:11:25.681 "claimed": false, 00:11:25.681 "zoned": false, 00:11:25.681 "supported_io_types": { 00:11:25.681 "read": true, 00:11:25.681 "write": true, 00:11:25.681 "unmap": true, 00:11:25.681 "flush": true, 00:11:25.681 "reset": true, 00:11:25.681 "nvme_admin": false, 00:11:25.681 "nvme_io": false, 00:11:25.681 "nvme_io_md": false, 00:11:25.681 "write_zeroes": true, 00:11:25.681 "zcopy": true, 00:11:25.681 "get_zone_info": false, 00:11:25.681 "zone_management": false, 00:11:25.681 "zone_append": false, 00:11:25.681 "compare": false, 00:11:25.681 "compare_and_write": false, 00:11:25.681 "abort": true, 00:11:25.681 "seek_hole": false, 00:11:25.681 "seek_data": false, 00:11:25.681 "copy": true, 00:11:25.681 "nvme_iov_md": false 00:11:25.681 }, 00:11:25.681 "memory_domains": [ 00:11:25.681 { 00:11:25.681 "dma_device_id": "system", 
00:11:25.681 "dma_device_type": 1 00:11:25.681 }, 00:11:25.681 { 00:11:25.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:25.681 "dma_device_type": 2 00:11:25.681 } 00:11:25.681 ], 00:11:25.681 "driver_specific": {} 00:11:25.681 } 00:11:25.681 ] 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:11:25.681 10:19:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:25.681 Running I/O for 10 seconds... 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:27.585 
10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:11:27.585 "tick_rate": 2300000000, 00:11:27.585 "ticks": 5333503324383612, 00:11:27.585 "bdevs": [ 00:11:27.585 { 00:11:27.585 "name": "Malloc_STAT", 00:11:27.585 "bytes_read": 772846080, 00:11:27.585 "num_read_ops": 188676, 00:11:27.585 "bytes_written": 0, 00:11:27.585 "num_write_ops": 0, 00:11:27.585 "bytes_unmapped": 0, 00:11:27.585 "num_unmap_ops": 0, 00:11:27.585 "bytes_copied": 0, 00:11:27.585 "num_copy_ops": 0, 00:11:27.585 "read_latency_ticks": 2232726430240, 00:11:27.585 "max_read_latency_ticks": 14324730, 00:11:27.585 "min_read_latency_ticks": 250156, 00:11:27.585 "write_latency_ticks": 0, 00:11:27.585 "max_write_latency_ticks": 0, 00:11:27.585 "min_write_latency_ticks": 0, 00:11:27.585 "unmap_latency_ticks": 0, 00:11:27.585 "max_unmap_latency_ticks": 0, 00:11:27.585 "min_unmap_latency_ticks": 0, 00:11:27.585 "copy_latency_ticks": 0, 00:11:27.585 "max_copy_latency_ticks": 0, 00:11:27.585 "min_copy_latency_ticks": 0, 00:11:27.585 "io_error": {} 00:11:27.585 } 00:11:27.585 ] 00:11:27.585 }' 00:11:27.585 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=188676 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:11:27.845 "tick_rate": 2300000000, 00:11:27.845 "ticks": 5333503482540400, 
00:11:27.845 "name": "Malloc_STAT", 00:11:27.845 "channels": [ 00:11:27.845 { 00:11:27.845 "thread_id": 2, 00:11:27.845 "bytes_read": 395313152, 00:11:27.845 "num_read_ops": 96512, 00:11:27.845 "bytes_written": 0, 00:11:27.845 "num_write_ops": 0, 00:11:27.845 "bytes_unmapped": 0, 00:11:27.845 "num_unmap_ops": 0, 00:11:27.845 "bytes_copied": 0, 00:11:27.845 "num_copy_ops": 0, 00:11:27.845 "read_latency_ticks": 1155824210808, 00:11:27.845 "max_read_latency_ticks": 12844162, 00:11:27.845 "min_read_latency_ticks": 7584478, 00:11:27.845 "write_latency_ticks": 0, 00:11:27.845 "max_write_latency_ticks": 0, 00:11:27.845 "min_write_latency_ticks": 0, 00:11:27.845 "unmap_latency_ticks": 0, 00:11:27.845 "max_unmap_latency_ticks": 0, 00:11:27.845 "min_unmap_latency_ticks": 0, 00:11:27.845 "copy_latency_ticks": 0, 00:11:27.845 "max_copy_latency_ticks": 0, 00:11:27.845 "min_copy_latency_ticks": 0 00:11:27.845 }, 00:11:27.845 { 00:11:27.845 "thread_id": 3, 00:11:27.845 "bytes_read": 404750336, 00:11:27.845 "num_read_ops": 98816, 00:11:27.845 "bytes_written": 0, 00:11:27.845 "num_write_ops": 0, 00:11:27.845 "bytes_unmapped": 0, 00:11:27.845 "num_unmap_ops": 0, 00:11:27.845 "bytes_copied": 0, 00:11:27.845 "num_copy_ops": 0, 00:11:27.845 "read_latency_ticks": 1156379950314, 00:11:27.845 "max_read_latency_ticks": 14324730, 00:11:27.845 "min_read_latency_ticks": 7592566, 00:11:27.845 "write_latency_ticks": 0, 00:11:27.845 "max_write_latency_ticks": 0, 00:11:27.845 "min_write_latency_ticks": 0, 00:11:27.845 "unmap_latency_ticks": 0, 00:11:27.845 "max_unmap_latency_ticks": 0, 00:11:27.845 "min_unmap_latency_ticks": 0, 00:11:27.845 "copy_latency_ticks": 0, 00:11:27.845 "max_copy_latency_ticks": 0, 00:11:27.845 "min_copy_latency_ticks": 0 00:11:27.845 } 00:11:27.845 ] 00:11:27.845 }' 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # 
io_count_per_channel1=96512 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=96512 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=98816 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=195328 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:27.845 10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:11:27.845 "tick_rate": 2300000000, 00:11:27.845 "ticks": 5333503770205002, 00:11:27.845 "bdevs": [ 00:11:27.845 { 00:11:27.845 "name": "Malloc_STAT", 00:11:27.845 "bytes_read": 851489280, 00:11:27.845 "num_read_ops": 207876, 00:11:27.845 "bytes_written": 0, 00:11:27.845 "num_write_ops": 0, 00:11:27.845 "bytes_unmapped": 0, 00:11:27.845 "num_unmap_ops": 0, 00:11:27.845 "bytes_copied": 0, 00:11:27.845 "num_copy_ops": 0, 00:11:27.845 "read_latency_ticks": 2460393464044, 00:11:27.845 "max_read_latency_ticks": 14324730, 00:11:27.845 "min_read_latency_ticks": 250156, 00:11:27.845 "write_latency_ticks": 0, 00:11:27.845 "max_write_latency_ticks": 0, 00:11:27.845 "min_write_latency_ticks": 0, 00:11:27.845 "unmap_latency_ticks": 0, 00:11:27.845 "max_unmap_latency_ticks": 0, 00:11:27.845 "min_unmap_latency_ticks": 0, 00:11:27.845 "copy_latency_ticks": 0, 00:11:27.845 "max_copy_latency_ticks": 0, 00:11:27.845 "min_copy_latency_ticks": 0, 00:11:27.845 "io_error": {} 00:11:27.845 } 00:11:27.845 ] 00:11:27.845 }' 00:11:27.845 
10:19:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:11:27.845 10:19:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=207876 00:11:27.845 10:19:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 195328 -lt 188676 ']' 00:11:27.845 10:19:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 195328 -gt 207876 ']' 00:11:27.845 10:19:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:11:27.845 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:27.845 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:27.845 00:11:27.845 Latency(us) 00:11:27.845 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:27.845 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:27.845 Malloc_STAT : 2.17 49087.27 191.75 0.00 0.00 5202.54 1652.65 5584.81 00:11:27.845 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:27.845 Malloc_STAT : 2.17 50257.15 196.32 0.00 0.00 5082.11 1396.20 6240.17 00:11:27.845 =================================================================================================================== 00:11:27.845 Total : 99344.41 388.06 0.00 0.00 5141.61 1396.20 6240.17 00:11:28.107 0 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 471388 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 471388 ']' 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 471388 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux 
']' 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 471388 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 471388' 00:11:28.107 killing process with pid 471388 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 471388 00:11:28.107 Received shutdown signal, test time was about 2.253454 seconds 00:11:28.107 00:11:28.107 Latency(us) 00:11:28.107 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:28.107 =================================================================================================================== 00:11:28.107 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:28.107 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 471388 00:11:28.401 10:19:05 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:11:28.401 00:11:28.401 real 0m3.663s 00:11:28.401 user 0m7.134s 00:11:28.401 sys 0m0.561s 00:11:28.401 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:28.401 10:19:05 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:28.401 ************************************ 00:11:28.401 END TEST bdev_stat 00:11:28.401 ************************************ 00:11:28.401 10:19:05 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:11:28.401 
10:19:05 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:11:28.401 10:19:05 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:11:28.401 00:11:28.401 real 1m57.402s 00:11:28.401 user 7m12.456s 00:11:28.401 sys 0m22.953s 00:11:28.401 10:19:05 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:28.401 10:19:05 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:28.401 ************************************ 00:11:28.401 END TEST blockdev_general 00:11:28.401 ************************************ 00:11:28.401 10:19:05 -- common/autotest_common.sh@1142 -- # return 0 00:11:28.401 10:19:05 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:28.401 10:19:05 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:28.401 10:19:05 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:28.401 10:19:05 -- common/autotest_common.sh@10 -- # set +x 00:11:28.401 ************************************ 00:11:28.401 START TEST bdev_raid 00:11:28.401 ************************************ 00:11:28.401 10:19:05 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:28.401 * Looking for test storage... 
00:11:28.401 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:11:28.401 10:19:05 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:11:28.401 10:19:05 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:11:28.401 10:19:05 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:11:28.401 10:19:05 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:11:28.401 10:19:05 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:11:28.401 10:19:05 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:11:28.660 10:19:05 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:11:28.660 10:19:05 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:11:28.660 10:19:05 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:11:28.660 10:19:05 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:11:28.660 10:19:05 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:11:28.660 10:19:05 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:11:28.660 10:19:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:28.660 10:19:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:28.660 10:19:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:28.660 ************************************ 00:11:28.660 START TEST raid_function_test_raid0 00:11:28.660 ************************************ 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:28.660 10:19:05 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=471999 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 471999' 00:11:28.660 Process raid pid: 471999 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 471999 /var/tmp/spdk-raid.sock 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 471999 ']' 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:28.660 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:28.660 10:19:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:28.660 [2024-07-15 10:19:05.709701] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:11:28.660 [2024-07-15 10:19:05.709768] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:28.660 [2024-07-15 10:19:05.840129] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:28.920 [2024-07-15 10:19:05.944854] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:28.920 [2024-07-15 10:19:06.009858] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:28.920 [2024-07-15 10:19:06.009894] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:29.487 10:19:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:29.487 10:19:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:11:29.487 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:11:29.487 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:11:29.487 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:29.487 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:11:29.487 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:29.746 [2024-07-15 10:19:06.907597] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:29.746 [2024-07-15 10:19:06.909053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:29.746 [2024-07-15 10:19:06.909112] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x258abd0 00:11:29.746 [2024-07-15 10:19:06.909124] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:29.746 [2024-07-15 10:19:06.909313] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x258ab10 00:11:29.746 [2024-07-15 10:19:06.909436] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x258abd0 00:11:29.746 [2024-07-15 10:19:06.909446] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x258abd0 00:11:29.746 [2024-07-15 10:19:06.909548] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:29.746 Base_1 00:11:29.746 Base_2 00:11:29.746 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:29.746 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:29.746 10:19:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local 
nbd_list 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:30.005 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:30.265 [2024-07-15 10:19:07.412954] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x273e8e0 00:11:30.265 /dev/nbd0 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:30.265 1+0 records in 00:11:30.265 1+0 
records out 00:11:30.265 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258852 s, 15.8 MB/s 00:11:30.265 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:30.523 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:30.524 { 00:11:30.524 "nbd_device": "/dev/nbd0", 00:11:30.524 "bdev_name": "raid" 00:11:30.524 } 00:11:30.524 ]' 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:30.524 { 00:11:30.524 "nbd_device": "/dev/nbd0", 00:11:30.524 "bdev_name": "raid" 00:11:30.524 } 00:11:30.524 ]' 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:30.524 10:19:07 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:30.524 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:30.783 4096+0 records in 00:11:30.783 4096+0 records out 00:11:30.783 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0284792 s, 73.6 MB/s 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:30.783 4096+0 records in 00:11:30.783 4096+0 records out 00:11:30.783 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.204158 s, 10.3 MB/s 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:30.783 128+0 records in 00:11:30.783 128+0 records out 00:11:30.783 65536 
bytes (66 kB, 64 KiB) copied, 0.000829213 s, 79.0 MB/s 00:11:30.783 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:31.042 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:31.042 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:31.042 10:19:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:31.042 2035+0 records in 00:11:31.042 2035+0 records out 00:11:31.042 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.010824 s, 96.3 MB/s 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:31.042 456+0 records in 00:11:31.042 456+0 records out 00:11:31.042 233472 bytes (233 kB, 228 KiB) copied, 0.00278519 s, 83.8 MB/s 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:31.042 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:31.301 [2024-07-15 10:19:08.333126] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:31.301 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:11:31.301 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:31.302 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 471999 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 471999 ']' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 471999 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 471999 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 471999' 00:11:31.561 killing process with pid 471999 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 471999 00:11:31.561 [2024-07-15 10:19:08.693590] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:31.561 [2024-07-15 10:19:08.693665] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:31.561 [2024-07-15 10:19:08.693706] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:31.561 [2024-07-15 10:19:08.693721] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258abd0 name raid, 
state offline 00:11:31.561 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 471999 00:11:31.561 [2024-07-15 10:19:08.710390] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:31.820 10:19:08 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:11:31.820 00:11:31.820 real 0m3.261s 00:11:31.820 user 0m4.364s 00:11:31.820 sys 0m1.193s 00:11:31.820 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:31.820 10:19:08 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:31.820 ************************************ 00:11:31.820 END TEST raid_function_test_raid0 00:11:31.820 ************************************ 00:11:31.820 10:19:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:31.820 10:19:08 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:11:31.820 10:19:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:31.820 10:19:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:31.820 10:19:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:31.820 ************************************ 00:11:31.820 START TEST raid_function_test_concat 00:11:31.820 ************************************ 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=472606 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 
-- # echo 'Process raid pid: 472606' 00:11:31.820 Process raid pid: 472606 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 472606 /var/tmp/spdk-raid.sock 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 472606 ']' 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:31.820 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:31.820 10:19:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:32.079 [2024-07-15 10:19:09.054151] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:11:32.079 [2024-07-15 10:19:09.054218] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:32.079 [2024-07-15 10:19:09.184065] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:32.337 [2024-07-15 10:19:09.291103] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:32.337 [2024-07-15 10:19:09.356143] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:32.337 [2024-07-15 10:19:09.356194] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:32.904 10:19:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:32.904 10:19:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:11:32.904 10:19:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:32.904 10:19:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:32.904 10:19:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:32.904 10:19:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:32.904 10:19:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:33.472 [2024-07-15 10:19:10.407749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:33.472 [2024-07-15 10:19:10.409238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:33.472 [2024-07-15 10:19:10.409297] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa18bd0 00:11:33.472 [2024-07-15 10:19:10.409308] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:33.472 [2024-07-15 10:19:10.409491] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa18b10 00:11:33.472 [2024-07-15 10:19:10.409611] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa18bd0 00:11:33.472 [2024-07-15 10:19:10.409621] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xa18bd0 00:11:33.472 [2024-07-15 10:19:10.409722] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:33.472 Base_1 00:11:33.472 Base_2 00:11:33.472 10:19:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:33.472 10:19:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:33.472 10:19:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:33.731 [2024-07-15 10:19:10.905077] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbcc8e0 00:11:33.731 /dev/nbd0 00:11:33.731 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:33.989 1+0 records in 
00:11:33.989 1+0 records out 00:11:33.989 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267292 s, 15.3 MB/s 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:33.989 10:19:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:34.248 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:34.248 { 00:11:34.248 "nbd_device": "/dev/nbd0", 00:11:34.248 "bdev_name": "raid" 00:11:34.248 } 00:11:34.248 ]' 00:11:34.248 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:34.248 { 00:11:34.248 "nbd_device": "/dev/nbd0", 00:11:34.248 "bdev_name": "raid" 00:11:34.248 } 00:11:34.248 ]' 00:11:34.248 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:34.248 10:19:11 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:34.248 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:34.248 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:34.248 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:34.248 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:34.249 4096+0 records in 00:11:34.249 4096+0 records out 00:11:34.249 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0295843 s, 70.9 MB/s 00:11:34.249 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:34.507 4096+0 records in 00:11:34.507 4096+0 records out 00:11:34.507 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.199381 s, 10.5 MB/s 00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 
00:11:34.507 128+0 records in
00:11:34.507 128+0 records out
00:11:34.507 65536 bytes (66 kB, 64 KiB) copied, 0.00088269 s, 74.2 MB/s
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ ))
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 ))
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc
00:11:34.507 2035+0 records in
00:11:34.507 2035+0 records out
00:11:34.507 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0111905 s, 93.1 MB/s
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ ))
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 ))
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc
00:11:34.507 456+0 records in
00:11:34.507 456+0 records out
00:11:34.507 233472 bytes (233 kB, 228 KiB) copied, 0.00279152 s, 83.6 MB/s
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ ))
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 ))
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:11:34.507 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:11:34.766 [2024-07-15 10:19:11.857619] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:34.766 10:19:11 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo ''
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']'
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 472606
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 472606 ']'
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 472606
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 472606
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 472606'
00:11:35.025 killing process with pid 472606
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 472606
00:11:35.025 [2024-07-15 10:19:12.222101] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:11:35.025 [2024-07-15 10:19:12.222165] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:35.025 [2024-07-15 10:19:12.222208] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:11:35.025 [2024-07-15 10:19:12.222226] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa18bd0 name raid, state offline
00:11:35.025 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 472606
00:11:35.284 [2024-07-15 10:19:12.239179] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:11:35.284 10:19:12 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0
00:11:35.284
00:11:35.284 real 0m3.462s
00:11:35.284 user 0m4.790s
00:11:35.284 sys 0m1.177s
00:11:35.284 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:35.284 10:19:12 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x
00:11:35.284 ************************************
00:11:35.284 END TEST raid_function_test_concat
00:11:35.284 ************************************
00:11:35.543 10:19:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:11:35.543 10:19:12 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test
00:11:35.543 10:19:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:11:35.543 10:19:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:35.543 10:19:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:11:35.543 ************************************
00:11:35.543 START TEST raid0_resize_test
00:11:35.543 ************************************
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=473107
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 473107'
00:11:35.543 Process raid pid: 473107
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 473107 /var/tmp/spdk-raid.sock
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 473107 ']'
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:11:35.543 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:11:35.543 10:19:12 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x
00:11:35.543 [2024-07-15 10:19:12.606659] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:11:35.543 [2024-07-15 10:19:12.606728] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:11:35.543 [2024-07-15 10:19:12.739010] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:35.802 [2024-07-15 10:19:12.836431] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:35.802 [2024-07-15 10:19:12.897668] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:35.802 [2024-07-15 10:19:12.897706] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:36.369 10:19:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:11:36.369 10:19:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0
00:11:36.369 10:19:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512
00:11:36.627 Base_1
00:11:36.627 10:19:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512
00:11:36.884 Base_2
00:11:36.884 10:19:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid
00:11:37.142 [2024-07-15 10:19:14.175559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed
00:11:37.142 [2024-07-15 10:19:14.176841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed
00:11:37.142 [2024-07-15 10:19:14.176899] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2617780
00:11:37.142 [2024-07-15 10:19:14.176909] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512
00:11:37.142 [2024-07-15 10:19:14.177111] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2163020
00:11:37.142 [2024-07-15 10:19:14.177205] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2617780
00:11:37.142 [2024-07-15 10:19:14.177215] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x2617780
00:11:37.143 [2024-07-15 10:19:14.177320] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:11:37.143 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64
00:11:37.401 [2024-07-15 10:19:14.420220] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev
00:11:37.401 [2024-07-15 10:19:14.420241] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072
00:11:37.401 true
00:11:37.401 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid
00:11:37.401 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks'
00:11:37.659 [2024-07-15 10:19:14.665024] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:11:37.659 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072
00:11:37.659 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64
00:11:37.659 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']'
00:11:37.659 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64
00:11:37.917 [2024-07-15 10:19:14.909483] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev
00:11:37.917 [2024-07-15 10:19:14.909506] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072
00:11:37.917 [2024-07-15 10:19:14.909533] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144
00:11:37.917 true
00:11:37.917 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid
00:11:37.917 10:19:14 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks'
00:11:38.176 [2024-07-15 10:19:15.142254] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']'
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 473107
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 473107 ']'
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 473107
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname
00:11:38.176 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:11:38.177 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 473107
00:11:38.177 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:11:38.177 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:11:38.177 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 473107'
00:11:38.177 killing process with pid 473107
00:11:38.177 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 473107
00:11:38.177 [2024-07-15 10:19:15.210766] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:11:38.177 [2024-07-15 10:19:15.210828] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:38.177 [2024-07-15 10:19:15.210871] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:11:38.177 [2024-07-15 10:19:15.210883] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2617780 name Raid, state offline
00:11:38.177 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 473107
00:11:38.177 [2024-07-15 10:19:15.212278] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:11:38.435 10:19:15 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0
00:11:38.435
00:11:38.435 real 0m2.880s
00:11:38.435 user 0m4.444s
00:11:38.435 sys 0m0.601s
00:11:38.435 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:38.435 10:19:15 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x
00:11:38.435 ************************************
00:11:38.435 END TEST raid0_resize_test
00:11:38.435 ************************************
00:11:38.435 10:19:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:11:38.435 10:19:15 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4}
00:11:38.435 10:19:15 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1
00:11:38.435 10:19:15 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false
00:11:38.435 10:19:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:11:38.435 10:19:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:38.435 10:19:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:11:38.435 ************************************
00:11:38.435 START TEST raid_state_function_test
00:11:38.435 ************************************
00:11:38.435 10:19:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false
00:11:38.435 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0
00:11:38.435 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']'
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=473598
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 473598'
00:11:38.436 Process raid pid: 473598
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 473598 /var/tmp/spdk-raid.sock
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 473598 ']'
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:11:38.436 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:11:38.436 10:19:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:38.436 [2024-07-15 10:19:15.561318] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:11:38.436 [2024-07-15 10:19:15.561380] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:11:38.695 [2024-07-15 10:19:15.691640] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:38.695 [2024-07-15 10:19:15.794912] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:38.695 [2024-07-15 10:19:15.854544] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:38.695 [2024-07-15 10:19:15.854580] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:11:39.630 [2024-07-15 10:19:16.723164] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:11:39.630 [2024-07-15 10:19:16.723208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:11:39.630 [2024-07-15 10:19:16.723219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:11:39.630 [2024-07-15 10:19:16.723231] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:11:39.630 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:11:39.888 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:11:39.888 "name": "Existed_Raid",
00:11:39.888 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:39.888 "strip_size_kb": 64,
00:11:39.888 "state": "configuring",
00:11:39.888 "raid_level": "raid0",
00:11:39.888 "superblock": false,
00:11:39.888 "num_base_bdevs": 2,
00:11:39.888 "num_base_bdevs_discovered": 0,
00:11:39.888 "num_base_bdevs_operational": 2,
00:11:39.888 "base_bdevs_list": [
00:11:39.888 {
00:11:39.888 "name": "BaseBdev1",
00:11:39.888 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:39.888 "is_configured": false,
00:11:39.888 "data_offset": 0,
00:11:39.888 "data_size": 0
00:11:39.888 },
00:11:39.888 {
00:11:39.888 "name": "BaseBdev2",
00:11:39.888 "uuid": "00000000-0000-0000-0000-000000000000",
00:11:39.888 "is_configured": false,
00:11:39.888 "data_offset": 0,
00:11:39.888 "data_size": 0
00:11:39.888 }
00:11:39.888 ]
00:11:39.888 }'
00:11:39.888 10:19:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:11:39.888 10:19:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:11:40.455 10:19:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:11:40.713 [2024-07-15 10:19:17.817911] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:11:40.713 [2024-07-15 10:19:17.817947] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf44a80 name Existed_Raid, state configuring
00:11:40.713 10:19:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:11:40.971 [2024-07-15 10:19:18.066583] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:11:40.971 [2024-07-15 10:19:18.066611] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:11:40.971 [2024-07-15 10:19:18.066621] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:11:40.971 [2024-07-15 10:19:18.066632] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:11:40.971 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:11:41.230 [2024-07-15 10:19:18.256916] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:11:41.230 BaseBdev1
00:11:41.230 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:11:41.230 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:11:41.230 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:11:41.230 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:11:41.230 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:11:41.230 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:11:41.230 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:11:41.488 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:11:41.747 [
00:11:41.747 {
00:11:41.747 "name": "BaseBdev1",
00:11:41.747 "aliases": [
00:11:41.747 "c68704ea-041f-458f-8db5-600733e64ff4"
00:11:41.747 ],
00:11:41.747 "product_name": "Malloc disk",
00:11:41.747 "block_size": 512,
00:11:41.747 "num_blocks": 65536,
00:11:41.747 "uuid": "c68704ea-041f-458f-8db5-600733e64ff4",
00:11:41.747 "assigned_rate_limits": {
00:11:41.747 "rw_ios_per_sec": 0,
00:11:41.747 "rw_mbytes_per_sec": 0,
00:11:41.747 "r_mbytes_per_sec": 0,
00:11:41.747 "w_mbytes_per_sec": 0
00:11:41.747 },
00:11:41.747 "claimed": true,
00:11:41.747 "claim_type": "exclusive_write",
00:11:41.747 "zoned": false,
00:11:41.747 "supported_io_types": {
00:11:41.747 "read": true,
00:11:41.747 "write": true,
00:11:41.747 "unmap": true,
00:11:41.747 "flush": true,
00:11:41.747 "reset": true,
00:11:41.747 "nvme_admin": false,
00:11:41.747 "nvme_io": false,
00:11:41.747 "nvme_io_md": false,
00:11:41.747 "write_zeroes": true,
00:11:41.747 "zcopy": true,
00:11:41.747 "get_zone_info": false,
00:11:41.747 "zone_management": false,
00:11:41.747 "zone_append": false,
00:11:41.747 "compare": false,
00:11:41.747 "compare_and_write": false,
00:11:41.747 "abort": true,
00:11:41.747 "seek_hole": false,
00:11:41.747 "seek_data": false,
00:11:41.747 "copy": true,
00:11:41.747 "nvme_iov_md": false
00:11:41.747 },
00:11:41.747 "memory_domains": [
00:11:41.747 {
00:11:41.747 "dma_device_id": "system",
00:11:41.747 "dma_device_type": 1
00:11:41.747 },
00:11:41.747 {
00:11:41.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:11:41.747 "dma_device_type": 2
00:11:41.747 }
00:11:41.747 ],
00:11:41.747 "driver_specific": {}
00:11:41.747 }
00:11:41.747 ]
00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.747 10:19:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.006 10:19:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.006 "name": "Existed_Raid", 00:11:42.006 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.006 "strip_size_kb": 64, 00:11:42.006 "state": "configuring", 00:11:42.006 "raid_level": "raid0", 00:11:42.006 "superblock": false, 00:11:42.006 "num_base_bdevs": 2, 00:11:42.006 "num_base_bdevs_discovered": 1, 00:11:42.006 "num_base_bdevs_operational": 2, 00:11:42.006 "base_bdevs_list": [ 00:11:42.006 { 00:11:42.006 "name": "BaseBdev1", 00:11:42.006 "uuid": "c68704ea-041f-458f-8db5-600733e64ff4", 00:11:42.006 "is_configured": true, 00:11:42.006 "data_offset": 0, 00:11:42.006 "data_size": 65536 00:11:42.006 }, 00:11:42.006 { 00:11:42.006 "name": "BaseBdev2", 00:11:42.006 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.007 "is_configured": false, 00:11:42.007 "data_offset": 0, 00:11:42.007 "data_size": 0 00:11:42.007 } 00:11:42.007 ] 00:11:42.007 }' 00:11:42.007 10:19:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.007 10:19:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:42.613 10:19:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:42.613 [2024-07-15 10:19:19.752904] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:42.613 [2024-07-15 10:19:19.752949] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf44350 name Existed_Raid, state configuring 00:11:42.613 10:19:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:42.872 [2024-07-15 10:19:19.997571] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:42.872 [2024-07-15 10:19:19.999055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:42.872 [2024-07-15 10:19:19.999087] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.872 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:43.131 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:43.131 "name": "Existed_Raid", 00:11:43.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.131 "strip_size_kb": 64, 00:11:43.131 "state": "configuring", 00:11:43.131 "raid_level": "raid0", 00:11:43.131 "superblock": false, 00:11:43.131 "num_base_bdevs": 2, 00:11:43.131 "num_base_bdevs_discovered": 1, 00:11:43.131 "num_base_bdevs_operational": 2, 00:11:43.131 "base_bdevs_list": [ 00:11:43.131 { 00:11:43.131 "name": "BaseBdev1", 00:11:43.131 "uuid": "c68704ea-041f-458f-8db5-600733e64ff4", 00:11:43.131 "is_configured": true, 00:11:43.131 "data_offset": 0, 00:11:43.131 "data_size": 65536 00:11:43.131 }, 00:11:43.131 { 00:11:43.131 "name": "BaseBdev2", 00:11:43.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.131 "is_configured": false, 00:11:43.131 "data_offset": 0, 00:11:43.131 "data_size": 0 00:11:43.131 } 00:11:43.131 ] 00:11:43.131 }' 
00:11:43.131 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:43.131 10:19:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.699 10:19:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:43.958 [2024-07-15 10:19:21.052908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:43.958 [2024-07-15 10:19:21.052954] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf45000 00:11:43.958 [2024-07-15 10:19:21.052968] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:43.958 [2024-07-15 10:19:21.053160] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe5f0c0 00:11:43.958 [2024-07-15 10:19:21.053282] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf45000 00:11:43.958 [2024-07-15 10:19:21.053293] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf45000 00:11:43.958 [2024-07-15 10:19:21.053460] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:43.958 BaseBdev2 00:11:43.958 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:43.958 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:43.958 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:43.958 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:43.958 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:43.958 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:11:43.958 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:44.217 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:44.476 [ 00:11:44.476 { 00:11:44.476 "name": "BaseBdev2", 00:11:44.476 "aliases": [ 00:11:44.476 "1dc82fff-2286-42dd-8b68-1b819dca0dca" 00:11:44.476 ], 00:11:44.476 "product_name": "Malloc disk", 00:11:44.476 "block_size": 512, 00:11:44.476 "num_blocks": 65536, 00:11:44.476 "uuid": "1dc82fff-2286-42dd-8b68-1b819dca0dca", 00:11:44.476 "assigned_rate_limits": { 00:11:44.476 "rw_ios_per_sec": 0, 00:11:44.476 "rw_mbytes_per_sec": 0, 00:11:44.476 "r_mbytes_per_sec": 0, 00:11:44.476 "w_mbytes_per_sec": 0 00:11:44.476 }, 00:11:44.476 "claimed": true, 00:11:44.476 "claim_type": "exclusive_write", 00:11:44.476 "zoned": false, 00:11:44.476 "supported_io_types": { 00:11:44.476 "read": true, 00:11:44.476 "write": true, 00:11:44.476 "unmap": true, 00:11:44.476 "flush": true, 00:11:44.476 "reset": true, 00:11:44.476 "nvme_admin": false, 00:11:44.476 "nvme_io": false, 00:11:44.476 "nvme_io_md": false, 00:11:44.476 "write_zeroes": true, 00:11:44.476 "zcopy": true, 00:11:44.476 "get_zone_info": false, 00:11:44.476 "zone_management": false, 00:11:44.477 "zone_append": false, 00:11:44.477 "compare": false, 00:11:44.477 "compare_and_write": false, 00:11:44.477 "abort": true, 00:11:44.477 "seek_hole": false, 00:11:44.477 "seek_data": false, 00:11:44.477 "copy": true, 00:11:44.477 "nvme_iov_md": false 00:11:44.477 }, 00:11:44.477 "memory_domains": [ 00:11:44.477 { 00:11:44.477 "dma_device_id": "system", 00:11:44.477 "dma_device_type": 1 00:11:44.477 }, 00:11:44.477 { 00:11:44.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.477 "dma_device_type": 2 
00:11:44.477 } 00:11:44.477 ], 00:11:44.477 "driver_specific": {} 00:11:44.477 } 00:11:44.477 ] 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.477 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.736 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:44.736 "name": "Existed_Raid", 00:11:44.736 "uuid": "e015d32f-8c80-4ab1-88dc-66549c14ee81", 00:11:44.736 "strip_size_kb": 64, 00:11:44.736 "state": "online", 00:11:44.736 "raid_level": "raid0", 00:11:44.736 "superblock": false, 00:11:44.736 "num_base_bdevs": 2, 00:11:44.736 "num_base_bdevs_discovered": 2, 00:11:44.736 "num_base_bdevs_operational": 2, 00:11:44.736 "base_bdevs_list": [ 00:11:44.736 { 00:11:44.736 "name": "BaseBdev1", 00:11:44.736 "uuid": "c68704ea-041f-458f-8db5-600733e64ff4", 00:11:44.736 "is_configured": true, 00:11:44.736 "data_offset": 0, 00:11:44.736 "data_size": 65536 00:11:44.736 }, 00:11:44.736 { 00:11:44.736 "name": "BaseBdev2", 00:11:44.736 "uuid": "1dc82fff-2286-42dd-8b68-1b819dca0dca", 00:11:44.736 "is_configured": true, 00:11:44.736 "data_offset": 0, 00:11:44.736 "data_size": 65536 00:11:44.736 } 00:11:44.736 ] 00:11:44.736 }' 00:11:44.736 10:19:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.736 10:19:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:45.305 10:19:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:45.305 [2024-07-15 10:19:22.432844] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:45.305 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:45.305 "name": "Existed_Raid", 00:11:45.305 "aliases": [ 00:11:45.305 "e015d32f-8c80-4ab1-88dc-66549c14ee81" 00:11:45.305 ], 00:11:45.305 "product_name": "Raid Volume", 00:11:45.305 "block_size": 512, 00:11:45.305 "num_blocks": 131072, 00:11:45.305 "uuid": "e015d32f-8c80-4ab1-88dc-66549c14ee81", 00:11:45.305 "assigned_rate_limits": { 00:11:45.305 "rw_ios_per_sec": 0, 00:11:45.305 "rw_mbytes_per_sec": 0, 00:11:45.305 "r_mbytes_per_sec": 0, 00:11:45.305 "w_mbytes_per_sec": 0 00:11:45.305 }, 00:11:45.305 "claimed": false, 00:11:45.305 "zoned": false, 00:11:45.305 "supported_io_types": { 00:11:45.305 "read": true, 00:11:45.305 "write": true, 00:11:45.305 "unmap": true, 00:11:45.305 "flush": true, 00:11:45.305 "reset": true, 00:11:45.305 "nvme_admin": false, 00:11:45.305 "nvme_io": false, 00:11:45.305 "nvme_io_md": false, 00:11:45.305 "write_zeroes": true, 00:11:45.305 "zcopy": false, 00:11:45.305 "get_zone_info": false, 00:11:45.305 "zone_management": false, 00:11:45.305 "zone_append": false, 00:11:45.305 "compare": false, 00:11:45.305 "compare_and_write": false, 00:11:45.305 "abort": false, 00:11:45.305 "seek_hole": false, 00:11:45.305 "seek_data": false, 00:11:45.305 "copy": false, 00:11:45.305 "nvme_iov_md": false 00:11:45.305 }, 00:11:45.305 "memory_domains": [ 00:11:45.305 { 00:11:45.305 "dma_device_id": "system", 00:11:45.305 "dma_device_type": 1 00:11:45.305 }, 00:11:45.305 { 00:11:45.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.305 "dma_device_type": 2 00:11:45.305 }, 00:11:45.305 { 00:11:45.305 "dma_device_id": "system", 00:11:45.305 "dma_device_type": 1 00:11:45.305 }, 00:11:45.305 { 00:11:45.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:11:45.305 "dma_device_type": 2 00:11:45.305 } 00:11:45.305 ], 00:11:45.305 "driver_specific": { 00:11:45.305 "raid": { 00:11:45.305 "uuid": "e015d32f-8c80-4ab1-88dc-66549c14ee81", 00:11:45.305 "strip_size_kb": 64, 00:11:45.305 "state": "online", 00:11:45.305 "raid_level": "raid0", 00:11:45.305 "superblock": false, 00:11:45.305 "num_base_bdevs": 2, 00:11:45.305 "num_base_bdevs_discovered": 2, 00:11:45.305 "num_base_bdevs_operational": 2, 00:11:45.305 "base_bdevs_list": [ 00:11:45.305 { 00:11:45.305 "name": "BaseBdev1", 00:11:45.305 "uuid": "c68704ea-041f-458f-8db5-600733e64ff4", 00:11:45.305 "is_configured": true, 00:11:45.305 "data_offset": 0, 00:11:45.305 "data_size": 65536 00:11:45.305 }, 00:11:45.305 { 00:11:45.305 "name": "BaseBdev2", 00:11:45.305 "uuid": "1dc82fff-2286-42dd-8b68-1b819dca0dca", 00:11:45.305 "is_configured": true, 00:11:45.305 "data_offset": 0, 00:11:45.305 "data_size": 65536 00:11:45.306 } 00:11:45.306 ] 00:11:45.306 } 00:11:45.306 } 00:11:45.306 }' 00:11:45.306 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:45.306 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:45.306 BaseBdev2' 00:11:45.306 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:45.306 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:45.306 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:45.565 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:45.565 "name": "BaseBdev1", 00:11:45.565 "aliases": [ 00:11:45.565 "c68704ea-041f-458f-8db5-600733e64ff4" 00:11:45.565 ], 00:11:45.565 "product_name": "Malloc disk", 
00:11:45.565 "block_size": 512, 00:11:45.565 "num_blocks": 65536, 00:11:45.565 "uuid": "c68704ea-041f-458f-8db5-600733e64ff4", 00:11:45.565 "assigned_rate_limits": { 00:11:45.565 "rw_ios_per_sec": 0, 00:11:45.565 "rw_mbytes_per_sec": 0, 00:11:45.565 "r_mbytes_per_sec": 0, 00:11:45.565 "w_mbytes_per_sec": 0 00:11:45.565 }, 00:11:45.565 "claimed": true, 00:11:45.565 "claim_type": "exclusive_write", 00:11:45.565 "zoned": false, 00:11:45.565 "supported_io_types": { 00:11:45.565 "read": true, 00:11:45.565 "write": true, 00:11:45.565 "unmap": true, 00:11:45.565 "flush": true, 00:11:45.565 "reset": true, 00:11:45.565 "nvme_admin": false, 00:11:45.565 "nvme_io": false, 00:11:45.565 "nvme_io_md": false, 00:11:45.565 "write_zeroes": true, 00:11:45.565 "zcopy": true, 00:11:45.565 "get_zone_info": false, 00:11:45.565 "zone_management": false, 00:11:45.565 "zone_append": false, 00:11:45.565 "compare": false, 00:11:45.565 "compare_and_write": false, 00:11:45.565 "abort": true, 00:11:45.565 "seek_hole": false, 00:11:45.565 "seek_data": false, 00:11:45.565 "copy": true, 00:11:45.565 "nvme_iov_md": false 00:11:45.565 }, 00:11:45.565 "memory_domains": [ 00:11:45.565 { 00:11:45.565 "dma_device_id": "system", 00:11:45.565 "dma_device_type": 1 00:11:45.565 }, 00:11:45.565 { 00:11:45.565 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:45.565 "dma_device_type": 2 00:11:45.565 } 00:11:45.565 ], 00:11:45.565 "driver_specific": {} 00:11:45.565 }' 00:11:45.565 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:45.825 10:19:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:45.825 10:19:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.084 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.084 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:46.084 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:46.084 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:46.084 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:46.342 "name": "BaseBdev2", 00:11:46.342 "aliases": [ 00:11:46.342 "1dc82fff-2286-42dd-8b68-1b819dca0dca" 00:11:46.342 ], 00:11:46.342 "product_name": "Malloc disk", 00:11:46.342 "block_size": 512, 00:11:46.342 "num_blocks": 65536, 00:11:46.342 "uuid": "1dc82fff-2286-42dd-8b68-1b819dca0dca", 00:11:46.342 "assigned_rate_limits": { 00:11:46.342 "rw_ios_per_sec": 0, 00:11:46.342 "rw_mbytes_per_sec": 0, 00:11:46.342 "r_mbytes_per_sec": 0, 00:11:46.342 "w_mbytes_per_sec": 0 00:11:46.342 }, 00:11:46.342 "claimed": true, 00:11:46.342 "claim_type": "exclusive_write", 00:11:46.342 "zoned": false, 00:11:46.342 "supported_io_types": { 00:11:46.342 "read": true, 00:11:46.342 "write": true, 00:11:46.342 "unmap": true, 00:11:46.342 "flush": true, 00:11:46.342 "reset": 
true, 00:11:46.342 "nvme_admin": false, 00:11:46.342 "nvme_io": false, 00:11:46.342 "nvme_io_md": false, 00:11:46.342 "write_zeroes": true, 00:11:46.342 "zcopy": true, 00:11:46.342 "get_zone_info": false, 00:11:46.342 "zone_management": false, 00:11:46.342 "zone_append": false, 00:11:46.342 "compare": false, 00:11:46.342 "compare_and_write": false, 00:11:46.342 "abort": true, 00:11:46.342 "seek_hole": false, 00:11:46.342 "seek_data": false, 00:11:46.342 "copy": true, 00:11:46.342 "nvme_iov_md": false 00:11:46.342 }, 00:11:46.342 "memory_domains": [ 00:11:46.342 { 00:11:46.342 "dma_device_id": "system", 00:11:46.342 "dma_device_type": 1 00:11:46.342 }, 00:11:46.342 { 00:11:46.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.342 "dma_device_type": 2 00:11:46.342 } 00:11:46.342 ], 00:11:46.342 "driver_specific": {} 00:11:46.342 }' 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.342 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:46.600 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:46.600 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.600 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:46.600 10:19:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:46.600 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:46.859 [2024-07-15 10:19:23.884476] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:46.859 [2024-07-15 10:19:23.884504] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:46.859 [2024-07-15 10:19:23.884545] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.859 10:19:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:47.118 10:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:47.118 "name": "Existed_Raid", 00:11:47.118 "uuid": "e015d32f-8c80-4ab1-88dc-66549c14ee81", 00:11:47.118 "strip_size_kb": 64, 00:11:47.118 "state": "offline", 00:11:47.118 "raid_level": "raid0", 00:11:47.118 "superblock": false, 00:11:47.118 "num_base_bdevs": 2, 00:11:47.118 "num_base_bdevs_discovered": 1, 00:11:47.118 "num_base_bdevs_operational": 1, 00:11:47.118 "base_bdevs_list": [ 00:11:47.118 { 00:11:47.118 "name": null, 00:11:47.118 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:47.118 "is_configured": false, 00:11:47.118 "data_offset": 0, 00:11:47.118 "data_size": 65536 00:11:47.118 }, 00:11:47.118 { 00:11:47.118 "name": "BaseBdev2", 00:11:47.118 "uuid": "1dc82fff-2286-42dd-8b68-1b819dca0dca", 00:11:47.118 "is_configured": true, 00:11:47.118 "data_offset": 0, 00:11:47.118 "data_size": 65536 00:11:47.118 } 00:11:47.118 ] 00:11:47.118 }' 00:11:47.118 10:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:47.118 10:19:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:47.686 10:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:47.686 10:19:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:47.686 10:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.686 10:19:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:47.945 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:47.945 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:47.945 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:48.514 [2024-07-15 10:19:25.486609] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:48.514 [2024-07-15 10:19:25.486661] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf45000 name Existed_Raid, state offline 00:11:48.514 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:48.514 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:48.514 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:48.514 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 473598 
00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 473598 ']' 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 473598 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 473598 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 473598' 00:11:48.773 killing process with pid 473598 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 473598 00:11:48.773 [2024-07-15 10:19:25.833277] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:48.773 10:19:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 473598 00:11:48.773 [2024-07-15 10:19:25.834263] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:49.032 00:11:49.032 real 0m10.556s 00:11:49.032 user 0m18.734s 00:11:49.032 sys 0m1.969s 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.032 ************************************ 00:11:49.032 END TEST raid_state_function_test 00:11:49.032 ************************************ 00:11:49.032 10:19:26 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:11:49.032 10:19:26 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:11:49.032 10:19:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:49.032 10:19:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:49.032 10:19:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:49.032 ************************************ 00:11:49.032 START TEST raid_state_function_test_sb 00:11:49.032 ************************************ 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=475160 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 475160' 00:11:49.032 Process raid pid: 475160 00:11:49.032 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:49.033 10:19:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 475160 /var/tmp/spdk-raid.sock 00:11:49.033 10:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # 
'[' -z 475160 ']' 00:11:49.033 10:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:49.033 10:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:49.033 10:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:49.033 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:49.033 10:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:49.033 10:19:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:49.033 [2024-07-15 10:19:26.196246] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:11:49.033 [2024-07-15 10:19:26.196312] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:49.292 [2024-07-15 10:19:26.323719] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:49.292 [2024-07-15 10:19:26.426212] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:49.292 [2024-07-15 10:19:26.490213] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:49.292 [2024-07-15 10:19:26.490248] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:50.227 [2024-07-15 10:19:27.349215] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:50.227 [2024-07-15 10:19:27.349260] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:50.227 [2024-07-15 10:19:27.349271] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:50.227 [2024-07-15 10:19:27.349283] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.227 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.484 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.484 "name": "Existed_Raid", 00:11:50.484 "uuid": "96add1d0-8cf3-4f01-bb1a-27d818016571", 00:11:50.484 "strip_size_kb": 64, 00:11:50.484 "state": "configuring", 00:11:50.484 "raid_level": "raid0", 00:11:50.484 "superblock": true, 00:11:50.484 "num_base_bdevs": 2, 00:11:50.484 "num_base_bdevs_discovered": 0, 00:11:50.484 "num_base_bdevs_operational": 2, 00:11:50.484 "base_bdevs_list": [ 00:11:50.484 { 00:11:50.484 "name": "BaseBdev1", 00:11:50.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.484 "is_configured": false, 00:11:50.484 "data_offset": 0, 00:11:50.484 "data_size": 0 00:11:50.484 }, 00:11:50.484 { 00:11:50.484 "name": "BaseBdev2", 00:11:50.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.484 "is_configured": false, 00:11:50.484 "data_offset": 0, 00:11:50.484 "data_size": 0 00:11:50.484 } 00:11:50.484 ] 00:11:50.484 }' 00:11:50.484 10:19:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.484 10:19:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:51.048 10:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:51.306 [2024-07-15 10:19:28.399866] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:51.306 [2024-07-15 10:19:28.399896] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc6fa80 name Existed_Raid, state configuring 00:11:51.306 10:19:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:51.565 [2024-07-15 10:19:28.644575] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:51.565 [2024-07-15 10:19:28.644601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:51.565 [2024-07-15 10:19:28.644611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:51.565 [2024-07-15 10:19:28.644622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:51.565 10:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:51.822 [2024-07-15 10:19:28.895095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:51.822 BaseBdev1 00:11:51.823 10:19:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:51.823 10:19:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:51.823 10:19:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:51.823 10:19:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:51.823 10:19:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:51.823 10:19:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:51.823 10:19:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:52.080 
10:19:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:52.338 [ 00:11:52.338 { 00:11:52.338 "name": "BaseBdev1", 00:11:52.338 "aliases": [ 00:11:52.338 "0f9cdfec-ac18-448d-b98e-1343aeaf86a3" 00:11:52.338 ], 00:11:52.338 "product_name": "Malloc disk", 00:11:52.338 "block_size": 512, 00:11:52.338 "num_blocks": 65536, 00:11:52.338 "uuid": "0f9cdfec-ac18-448d-b98e-1343aeaf86a3", 00:11:52.338 "assigned_rate_limits": { 00:11:52.338 "rw_ios_per_sec": 0, 00:11:52.338 "rw_mbytes_per_sec": 0, 00:11:52.338 "r_mbytes_per_sec": 0, 00:11:52.338 "w_mbytes_per_sec": 0 00:11:52.338 }, 00:11:52.338 "claimed": true, 00:11:52.338 "claim_type": "exclusive_write", 00:11:52.338 "zoned": false, 00:11:52.338 "supported_io_types": { 00:11:52.338 "read": true, 00:11:52.338 "write": true, 00:11:52.338 "unmap": true, 00:11:52.338 "flush": true, 00:11:52.338 "reset": true, 00:11:52.338 "nvme_admin": false, 00:11:52.338 "nvme_io": false, 00:11:52.338 "nvme_io_md": false, 00:11:52.338 "write_zeroes": true, 00:11:52.338 "zcopy": true, 00:11:52.338 "get_zone_info": false, 00:11:52.338 "zone_management": false, 00:11:52.338 "zone_append": false, 00:11:52.338 "compare": false, 00:11:52.338 "compare_and_write": false, 00:11:52.338 "abort": true, 00:11:52.338 "seek_hole": false, 00:11:52.338 "seek_data": false, 00:11:52.338 "copy": true, 00:11:52.338 "nvme_iov_md": false 00:11:52.338 }, 00:11:52.338 "memory_domains": [ 00:11:52.338 { 00:11:52.338 "dma_device_id": "system", 00:11:52.338 "dma_device_type": 1 00:11:52.338 }, 00:11:52.338 { 00:11:52.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.338 "dma_device_type": 2 00:11:52.338 } 00:11:52.338 ], 00:11:52.338 "driver_specific": {} 00:11:52.338 } 00:11:52.338 ] 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:52.338 
10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.338 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:52.596 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.596 "name": "Existed_Raid", 00:11:52.596 "uuid": "1a34f873-e416-41a4-9801-a6166833a453", 00:11:52.596 "strip_size_kb": 64, 00:11:52.596 "state": "configuring", 00:11:52.596 "raid_level": "raid0", 00:11:52.596 "superblock": true, 00:11:52.596 "num_base_bdevs": 2, 00:11:52.596 "num_base_bdevs_discovered": 1, 00:11:52.596 "num_base_bdevs_operational": 2, 00:11:52.596 
"base_bdevs_list": [ 00:11:52.596 { 00:11:52.596 "name": "BaseBdev1", 00:11:52.596 "uuid": "0f9cdfec-ac18-448d-b98e-1343aeaf86a3", 00:11:52.596 "is_configured": true, 00:11:52.596 "data_offset": 2048, 00:11:52.596 "data_size": 63488 00:11:52.596 }, 00:11:52.596 { 00:11:52.596 "name": "BaseBdev2", 00:11:52.596 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.596 "is_configured": false, 00:11:52.596 "data_offset": 0, 00:11:52.596 "data_size": 0 00:11:52.596 } 00:11:52.596 ] 00:11:52.596 }' 00:11:52.596 10:19:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:52.596 10:19:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:53.162 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:53.420 [2024-07-15 10:19:30.391101] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:53.420 [2024-07-15 10:19:30.391142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc6f350 name Existed_Raid, state configuring 00:11:53.420 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:53.678 [2024-07-15 10:19:30.631775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:53.678 [2024-07-15 10:19:30.633278] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:53.678 [2024-07-15 10:19:30.633312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.678 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.936 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.936 "name": "Existed_Raid", 00:11:53.936 "uuid": "a90a78e5-b5fa-479e-98be-1aee57cf23ec", 00:11:53.936 "strip_size_kb": 64, 00:11:53.936 "state": "configuring", 00:11:53.936 "raid_level": "raid0", 00:11:53.936 "superblock": true, 00:11:53.936 "num_base_bdevs": 2, 00:11:53.936 
"num_base_bdevs_discovered": 1, 00:11:53.936 "num_base_bdevs_operational": 2, 00:11:53.936 "base_bdevs_list": [ 00:11:53.936 { 00:11:53.936 "name": "BaseBdev1", 00:11:53.936 "uuid": "0f9cdfec-ac18-448d-b98e-1343aeaf86a3", 00:11:53.936 "is_configured": true, 00:11:53.936 "data_offset": 2048, 00:11:53.936 "data_size": 63488 00:11:53.936 }, 00:11:53.936 { 00:11:53.936 "name": "BaseBdev2", 00:11:53.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.936 "is_configured": false, 00:11:53.936 "data_offset": 0, 00:11:53.936 "data_size": 0 00:11:53.936 } 00:11:53.936 ] 00:11:53.936 }' 00:11:53.936 10:19:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.936 10:19:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.504 10:19:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:54.504 [2024-07-15 10:19:31.697887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:54.504 [2024-07-15 10:19:31.698045] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc70000 00:11:54.504 [2024-07-15 10:19:31.698059] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:54.504 [2024-07-15 10:19:31.698237] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb8a0c0 00:11:54.504 [2024-07-15 10:19:31.698352] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc70000 00:11:54.504 [2024-07-15 10:19:31.698362] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc70000 00:11:54.504 [2024-07-15 10:19:31.698453] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:54.504 BaseBdev2 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:54.762 10:19:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:55.021 [ 00:11:55.021 { 00:11:55.021 "name": "BaseBdev2", 00:11:55.021 "aliases": [ 00:11:55.021 "dbe5277f-7f40-48ed-b433-bf4f4248f4e9" 00:11:55.021 ], 00:11:55.021 "product_name": "Malloc disk", 00:11:55.021 "block_size": 512, 00:11:55.021 "num_blocks": 65536, 00:11:55.021 "uuid": "dbe5277f-7f40-48ed-b433-bf4f4248f4e9", 00:11:55.021 "assigned_rate_limits": { 00:11:55.021 "rw_ios_per_sec": 0, 00:11:55.021 "rw_mbytes_per_sec": 0, 00:11:55.021 "r_mbytes_per_sec": 0, 00:11:55.021 "w_mbytes_per_sec": 0 00:11:55.021 }, 00:11:55.021 "claimed": true, 00:11:55.021 "claim_type": "exclusive_write", 00:11:55.021 "zoned": false, 00:11:55.021 "supported_io_types": { 00:11:55.021 "read": true, 00:11:55.021 "write": true, 00:11:55.021 "unmap": true, 00:11:55.021 "flush": true, 00:11:55.021 "reset": true, 00:11:55.021 "nvme_admin": false, 00:11:55.021 "nvme_io": false, 00:11:55.021 "nvme_io_md": false, 00:11:55.021 "write_zeroes": true, 
00:11:55.021 "zcopy": true, 00:11:55.021 "get_zone_info": false, 00:11:55.021 "zone_management": false, 00:11:55.021 "zone_append": false, 00:11:55.021 "compare": false, 00:11:55.021 "compare_and_write": false, 00:11:55.021 "abort": true, 00:11:55.021 "seek_hole": false, 00:11:55.021 "seek_data": false, 00:11:55.021 "copy": true, 00:11:55.021 "nvme_iov_md": false 00:11:55.021 }, 00:11:55.021 "memory_domains": [ 00:11:55.021 { 00:11:55.021 "dma_device_id": "system", 00:11:55.021 "dma_device_type": 1 00:11:55.021 }, 00:11:55.021 { 00:11:55.021 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.021 "dma_device_type": 2 00:11:55.021 } 00:11:55.021 ], 00:11:55.021 "driver_specific": {} 00:11:55.021 } 00:11:55.021 ] 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.021 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.279 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.279 "name": "Existed_Raid", 00:11:55.279 "uuid": "a90a78e5-b5fa-479e-98be-1aee57cf23ec", 00:11:55.279 "strip_size_kb": 64, 00:11:55.279 "state": "online", 00:11:55.279 "raid_level": "raid0", 00:11:55.279 "superblock": true, 00:11:55.279 "num_base_bdevs": 2, 00:11:55.279 "num_base_bdevs_discovered": 2, 00:11:55.279 "num_base_bdevs_operational": 2, 00:11:55.279 "base_bdevs_list": [ 00:11:55.279 { 00:11:55.279 "name": "BaseBdev1", 00:11:55.279 "uuid": "0f9cdfec-ac18-448d-b98e-1343aeaf86a3", 00:11:55.279 "is_configured": true, 00:11:55.279 "data_offset": 2048, 00:11:55.279 "data_size": 63488 00:11:55.279 }, 00:11:55.279 { 00:11:55.279 "name": "BaseBdev2", 00:11:55.280 "uuid": "dbe5277f-7f40-48ed-b433-bf4f4248f4e9", 00:11:55.280 "is_configured": true, 00:11:55.280 "data_offset": 2048, 00:11:55.280 "data_size": 63488 00:11:55.280 } 00:11:55.280 ] 00:11:55.280 }' 00:11:55.280 10:19:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.280 10:19:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:56.218 [2024-07-15 10:19:33.210188] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:56.218 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:56.218 "name": "Existed_Raid", 00:11:56.218 "aliases": [ 00:11:56.218 "a90a78e5-b5fa-479e-98be-1aee57cf23ec" 00:11:56.218 ], 00:11:56.218 "product_name": "Raid Volume", 00:11:56.218 "block_size": 512, 00:11:56.218 "num_blocks": 126976, 00:11:56.218 "uuid": "a90a78e5-b5fa-479e-98be-1aee57cf23ec", 00:11:56.218 "assigned_rate_limits": { 00:11:56.218 "rw_ios_per_sec": 0, 00:11:56.218 "rw_mbytes_per_sec": 0, 00:11:56.218 "r_mbytes_per_sec": 0, 00:11:56.218 "w_mbytes_per_sec": 0 00:11:56.218 }, 00:11:56.218 "claimed": false, 00:11:56.218 "zoned": false, 00:11:56.218 "supported_io_types": { 00:11:56.218 "read": true, 00:11:56.218 "write": true, 00:11:56.218 "unmap": true, 00:11:56.218 "flush": true, 00:11:56.218 "reset": true, 00:11:56.218 "nvme_admin": false, 00:11:56.218 "nvme_io": false, 00:11:56.218 "nvme_io_md": false, 00:11:56.218 "write_zeroes": true, 00:11:56.218 "zcopy": false, 00:11:56.218 "get_zone_info": false, 00:11:56.218 "zone_management": false, 00:11:56.218 
"zone_append": false, 00:11:56.218 "compare": false, 00:11:56.218 "compare_and_write": false, 00:11:56.218 "abort": false, 00:11:56.218 "seek_hole": false, 00:11:56.218 "seek_data": false, 00:11:56.218 "copy": false, 00:11:56.218 "nvme_iov_md": false 00:11:56.218 }, 00:11:56.218 "memory_domains": [ 00:11:56.218 { 00:11:56.218 "dma_device_id": "system", 00:11:56.218 "dma_device_type": 1 00:11:56.218 }, 00:11:56.218 { 00:11:56.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.218 "dma_device_type": 2 00:11:56.219 }, 00:11:56.219 { 00:11:56.219 "dma_device_id": "system", 00:11:56.219 "dma_device_type": 1 00:11:56.219 }, 00:11:56.219 { 00:11:56.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.219 "dma_device_type": 2 00:11:56.219 } 00:11:56.219 ], 00:11:56.219 "driver_specific": { 00:11:56.219 "raid": { 00:11:56.219 "uuid": "a90a78e5-b5fa-479e-98be-1aee57cf23ec", 00:11:56.219 "strip_size_kb": 64, 00:11:56.219 "state": "online", 00:11:56.219 "raid_level": "raid0", 00:11:56.219 "superblock": true, 00:11:56.219 "num_base_bdevs": 2, 00:11:56.219 "num_base_bdevs_discovered": 2, 00:11:56.219 "num_base_bdevs_operational": 2, 00:11:56.219 "base_bdevs_list": [ 00:11:56.219 { 00:11:56.219 "name": "BaseBdev1", 00:11:56.219 "uuid": "0f9cdfec-ac18-448d-b98e-1343aeaf86a3", 00:11:56.219 "is_configured": true, 00:11:56.219 "data_offset": 2048, 00:11:56.219 "data_size": 63488 00:11:56.219 }, 00:11:56.219 { 00:11:56.219 "name": "BaseBdev2", 00:11:56.219 "uuid": "dbe5277f-7f40-48ed-b433-bf4f4248f4e9", 00:11:56.219 "is_configured": true, 00:11:56.219 "data_offset": 2048, 00:11:56.219 "data_size": 63488 00:11:56.219 } 00:11:56.219 ] 00:11:56.219 } 00:11:56.219 } 00:11:56.219 }' 00:11:56.219 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:56.219 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:56.219 
BaseBdev2' 00:11:56.219 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:56.219 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:56.219 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:56.219 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:56.219 "name": "BaseBdev1", 00:11:56.219 "aliases": [ 00:11:56.219 "0f9cdfec-ac18-448d-b98e-1343aeaf86a3" 00:11:56.219 ], 00:11:56.219 "product_name": "Malloc disk", 00:11:56.219 "block_size": 512, 00:11:56.219 "num_blocks": 65536, 00:11:56.219 "uuid": "0f9cdfec-ac18-448d-b98e-1343aeaf86a3", 00:11:56.219 "assigned_rate_limits": { 00:11:56.219 "rw_ios_per_sec": 0, 00:11:56.219 "rw_mbytes_per_sec": 0, 00:11:56.219 "r_mbytes_per_sec": 0, 00:11:56.219 "w_mbytes_per_sec": 0 00:11:56.219 }, 00:11:56.219 "claimed": true, 00:11:56.219 "claim_type": "exclusive_write", 00:11:56.219 "zoned": false, 00:11:56.219 "supported_io_types": { 00:11:56.219 "read": true, 00:11:56.219 "write": true, 00:11:56.219 "unmap": true, 00:11:56.219 "flush": true, 00:11:56.219 "reset": true, 00:11:56.219 "nvme_admin": false, 00:11:56.219 "nvme_io": false, 00:11:56.219 "nvme_io_md": false, 00:11:56.219 "write_zeroes": true, 00:11:56.219 "zcopy": true, 00:11:56.219 "get_zone_info": false, 00:11:56.219 "zone_management": false, 00:11:56.219 "zone_append": false, 00:11:56.219 "compare": false, 00:11:56.219 "compare_and_write": false, 00:11:56.219 "abort": true, 00:11:56.219 "seek_hole": false, 00:11:56.219 "seek_data": false, 00:11:56.219 "copy": true, 00:11:56.219 "nvme_iov_md": false 00:11:56.219 }, 00:11:56.219 "memory_domains": [ 00:11:56.219 { 00:11:56.219 "dma_device_id": "system", 00:11:56.219 "dma_device_type": 1 00:11:56.219 }, 00:11:56.219 { 
00:11:56.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:56.219 "dma_device_type": 2 00:11:56.219 } 00:11:56.219 ], 00:11:56.219 "driver_specific": {} 00:11:56.219 }' 00:11:56.219 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.476 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:56.476 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:56.476 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.476 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:56.476 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:56.476 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:56.476 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:56.768 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:56.768 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.768 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:56.768 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:56.768 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:56.768 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:56.768 10:19:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:57.026 "name": 
"BaseBdev2", 00:11:57.026 "aliases": [ 00:11:57.026 "dbe5277f-7f40-48ed-b433-bf4f4248f4e9" 00:11:57.026 ], 00:11:57.026 "product_name": "Malloc disk", 00:11:57.026 "block_size": 512, 00:11:57.026 "num_blocks": 65536, 00:11:57.026 "uuid": "dbe5277f-7f40-48ed-b433-bf4f4248f4e9", 00:11:57.026 "assigned_rate_limits": { 00:11:57.026 "rw_ios_per_sec": 0, 00:11:57.026 "rw_mbytes_per_sec": 0, 00:11:57.026 "r_mbytes_per_sec": 0, 00:11:57.026 "w_mbytes_per_sec": 0 00:11:57.026 }, 00:11:57.026 "claimed": true, 00:11:57.026 "claim_type": "exclusive_write", 00:11:57.026 "zoned": false, 00:11:57.026 "supported_io_types": { 00:11:57.026 "read": true, 00:11:57.026 "write": true, 00:11:57.026 "unmap": true, 00:11:57.026 "flush": true, 00:11:57.026 "reset": true, 00:11:57.026 "nvme_admin": false, 00:11:57.026 "nvme_io": false, 00:11:57.026 "nvme_io_md": false, 00:11:57.026 "write_zeroes": true, 00:11:57.026 "zcopy": true, 00:11:57.026 "get_zone_info": false, 00:11:57.026 "zone_management": false, 00:11:57.026 "zone_append": false, 00:11:57.026 "compare": false, 00:11:57.026 "compare_and_write": false, 00:11:57.026 "abort": true, 00:11:57.026 "seek_hole": false, 00:11:57.026 "seek_data": false, 00:11:57.026 "copy": true, 00:11:57.026 "nvme_iov_md": false 00:11:57.026 }, 00:11:57.026 "memory_domains": [ 00:11:57.026 { 00:11:57.026 "dma_device_id": "system", 00:11:57.026 "dma_device_type": 1 00:11:57.026 }, 00:11:57.026 { 00:11:57.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.026 "dma_device_type": 2 00:11:57.026 } 00:11:57.026 ], 00:11:57.026 "driver_specific": {} 00:11:57.026 }' 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:57.026 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:57.285 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:57.286 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:57.286 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:57.286 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:57.286 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:57.286 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:57.545 [2024-07-15 10:19:34.521449] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:57.545 [2024-07-15 10:19:34.521476] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:57.545 [2024-07-15 10:19:34.521517] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:57.545 10:19:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.545 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.804 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.804 "name": "Existed_Raid", 00:11:57.804 "uuid": "a90a78e5-b5fa-479e-98be-1aee57cf23ec", 00:11:57.804 "strip_size_kb": 64, 00:11:57.804 "state": "offline", 00:11:57.804 "raid_level": "raid0", 00:11:57.804 "superblock": true, 00:11:57.804 "num_base_bdevs": 2, 00:11:57.804 "num_base_bdevs_discovered": 1, 00:11:57.804 "num_base_bdevs_operational": 1, 00:11:57.804 "base_bdevs_list": [ 
00:11:57.804 { 00:11:57.804 "name": null, 00:11:57.804 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.805 "is_configured": false, 00:11:57.805 "data_offset": 2048, 00:11:57.805 "data_size": 63488 00:11:57.805 }, 00:11:57.805 { 00:11:57.805 "name": "BaseBdev2", 00:11:57.805 "uuid": "dbe5277f-7f40-48ed-b433-bf4f4248f4e9", 00:11:57.805 "is_configured": true, 00:11:57.805 "data_offset": 2048, 00:11:57.805 "data_size": 63488 00:11:57.805 } 00:11:57.805 ] 00:11:57.805 }' 00:11:57.805 10:19:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.805 10:19:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:58.374 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:58.374 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:58.374 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.374 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:58.633 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:58.633 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:58.633 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:58.892 [2024-07-15 10:19:35.846814] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:58.892 [2024-07-15 10:19:35.846863] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc70000 name Existed_Raid, state offline 00:11:58.892 10:19:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:58.892 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:58.892 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.892 10:19:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 475160 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 475160 ']' 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 475160 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 475160 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 475160' 00:11:59.151 killing process with pid 475160 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 475160 00:11:59.151 [2024-07-15 10:19:36.176942] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:59.151 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 475160 00:11:59.151 [2024-07-15 10:19:36.177904] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:59.411 10:19:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:59.411 00:11:59.411 real 0m10.273s 00:11:59.411 user 0m18.362s 00:11:59.411 sys 0m1.803s 00:11:59.411 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:59.411 10:19:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:59.411 ************************************ 00:11:59.411 END TEST raid_state_function_test_sb 00:11:59.411 ************************************ 00:11:59.411 10:19:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:59.411 10:19:36 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:59.411 10:19:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:11:59.411 10:19:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:59.411 10:19:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:59.411 ************************************ 00:11:59.411 START TEST raid_superblock_test 00:11:59.411 ************************************ 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:59.411 10:19:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=476707 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 476707 /var/tmp/spdk-raid.sock 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 476707 ']' 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:59.411 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:59.411 10:19:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.411 [2024-07-15 10:19:36.557837] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:11:59.411 [2024-07-15 10:19:36.557909] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476707 ] 00:11:59.671 [2024-07-15 10:19:36.686647] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:59.671 [2024-07-15 10:19:36.789203] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:59.671 [2024-07-15 10:19:36.856201] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:59.671 [2024-07-15 10:19:36.856245] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:00.239 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:00.499 malloc1 00:12:00.499 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:00.759 [2024-07-15 10:19:37.898909] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:00.759 [2024-07-15 10:19:37.898963] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:00.759 [2024-07-15 10:19:37.898983] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c8c570 00:12:00.759 [2024-07-15 10:19:37.898996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:00.759 [2024-07-15 10:19:37.900617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:00.759 [2024-07-15 10:19:37.900644] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:00.759 pt1 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:00.759 10:19:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:01.019 malloc2 00:12:01.019 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:01.279 [2024-07-15 10:19:38.393136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:01.279 [2024-07-15 10:19:38.393182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:01.279 [2024-07-15 10:19:38.393200] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c8d970 00:12:01.279 [2024-07-15 10:19:38.393212] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:01.279 [2024-07-15 10:19:38.394737] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:01.279 [2024-07-15 10:19:38.394766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:01.279 pt2 00:12:01.279 10:19:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:01.279 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:01.279 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:01.538 [2024-07-15 10:19:38.637804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:01.538 [2024-07-15 10:19:38.639026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:01.538 [2024-07-15 10:19:38.639166] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e30270 00:12:01.538 [2024-07-15 10:19:38.639179] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:01.538 [2024-07-15 10:19:38.639372] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e25c10 00:12:01.538 [2024-07-15 10:19:38.639516] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e30270 00:12:01.538 [2024-07-15 10:19:38.639530] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e30270 00:12:01.538 [2024-07-15 10:19:38.639629] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.538 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:01.797 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:01.797 "name": "raid_bdev1", 00:12:01.797 "uuid": "a8e26685-59e8-436a-8a2d-0b2218a6e3a0", 00:12:01.797 "strip_size_kb": 64, 00:12:01.797 "state": "online", 00:12:01.797 "raid_level": "raid0", 00:12:01.797 "superblock": true, 00:12:01.797 "num_base_bdevs": 2, 00:12:01.797 "num_base_bdevs_discovered": 2, 00:12:01.797 "num_base_bdevs_operational": 2, 00:12:01.797 "base_bdevs_list": [ 00:12:01.797 { 00:12:01.797 "name": "pt1", 00:12:01.797 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:01.797 "is_configured": true, 00:12:01.797 "data_offset": 2048, 00:12:01.797 "data_size": 63488 00:12:01.797 }, 00:12:01.797 { 00:12:01.797 "name": "pt2", 00:12:01.797 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:01.797 "is_configured": true, 00:12:01.797 "data_offset": 2048, 00:12:01.797 "data_size": 63488 00:12:01.797 } 00:12:01.797 ] 00:12:01.797 }' 00:12:01.797 10:19:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:01.797 10:19:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.364 10:19:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:02.364 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:02.364 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:02.364 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:02.364 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:02.364 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:02.364 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:02.364 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:02.622 [2024-07-15 10:19:39.732939] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:02.622 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:02.622 "name": "raid_bdev1", 00:12:02.622 "aliases": [ 00:12:02.622 "a8e26685-59e8-436a-8a2d-0b2218a6e3a0" 00:12:02.622 ], 00:12:02.622 "product_name": "Raid Volume", 00:12:02.622 "block_size": 512, 00:12:02.622 "num_blocks": 126976, 00:12:02.622 "uuid": "a8e26685-59e8-436a-8a2d-0b2218a6e3a0", 00:12:02.622 "assigned_rate_limits": { 00:12:02.622 "rw_ios_per_sec": 0, 00:12:02.622 "rw_mbytes_per_sec": 0, 00:12:02.622 "r_mbytes_per_sec": 0, 00:12:02.622 "w_mbytes_per_sec": 0 00:12:02.622 }, 00:12:02.622 "claimed": false, 00:12:02.622 "zoned": false, 00:12:02.622 "supported_io_types": { 00:12:02.622 "read": true, 00:12:02.622 "write": true, 00:12:02.622 "unmap": true, 00:12:02.622 "flush": true, 00:12:02.622 "reset": true, 00:12:02.622 "nvme_admin": false, 00:12:02.622 "nvme_io": false, 00:12:02.622 "nvme_io_md": false, 00:12:02.622 "write_zeroes": 
true, 00:12:02.622 "zcopy": false, 00:12:02.622 "get_zone_info": false, 00:12:02.622 "zone_management": false, 00:12:02.622 "zone_append": false, 00:12:02.622 "compare": false, 00:12:02.622 "compare_and_write": false, 00:12:02.622 "abort": false, 00:12:02.622 "seek_hole": false, 00:12:02.622 "seek_data": false, 00:12:02.622 "copy": false, 00:12:02.622 "nvme_iov_md": false 00:12:02.622 }, 00:12:02.622 "memory_domains": [ 00:12:02.622 { 00:12:02.622 "dma_device_id": "system", 00:12:02.622 "dma_device_type": 1 00:12:02.622 }, 00:12:02.622 { 00:12:02.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.622 "dma_device_type": 2 00:12:02.622 }, 00:12:02.622 { 00:12:02.622 "dma_device_id": "system", 00:12:02.622 "dma_device_type": 1 00:12:02.622 }, 00:12:02.622 { 00:12:02.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.622 "dma_device_type": 2 00:12:02.622 } 00:12:02.622 ], 00:12:02.622 "driver_specific": { 00:12:02.622 "raid": { 00:12:02.622 "uuid": "a8e26685-59e8-436a-8a2d-0b2218a6e3a0", 00:12:02.622 "strip_size_kb": 64, 00:12:02.622 "state": "online", 00:12:02.622 "raid_level": "raid0", 00:12:02.622 "superblock": true, 00:12:02.622 "num_base_bdevs": 2, 00:12:02.622 "num_base_bdevs_discovered": 2, 00:12:02.622 "num_base_bdevs_operational": 2, 00:12:02.622 "base_bdevs_list": [ 00:12:02.622 { 00:12:02.622 "name": "pt1", 00:12:02.622 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:02.622 "is_configured": true, 00:12:02.622 "data_offset": 2048, 00:12:02.622 "data_size": 63488 00:12:02.622 }, 00:12:02.622 { 00:12:02.622 "name": "pt2", 00:12:02.622 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:02.622 "is_configured": true, 00:12:02.622 "data_offset": 2048, 00:12:02.622 "data_size": 63488 00:12:02.622 } 00:12:02.622 ] 00:12:02.622 } 00:12:02.622 } 00:12:02.622 }' 00:12:02.622 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:02.622 10:19:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:02.622 pt2' 00:12:02.622 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:02.622 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:02.622 10:19:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:02.881 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:02.881 "name": "pt1", 00:12:02.881 "aliases": [ 00:12:02.881 "00000000-0000-0000-0000-000000000001" 00:12:02.881 ], 00:12:02.881 "product_name": "passthru", 00:12:02.881 "block_size": 512, 00:12:02.881 "num_blocks": 65536, 00:12:02.881 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:02.881 "assigned_rate_limits": { 00:12:02.881 "rw_ios_per_sec": 0, 00:12:02.881 "rw_mbytes_per_sec": 0, 00:12:02.881 "r_mbytes_per_sec": 0, 00:12:02.881 "w_mbytes_per_sec": 0 00:12:02.881 }, 00:12:02.881 "claimed": true, 00:12:02.881 "claim_type": "exclusive_write", 00:12:02.881 "zoned": false, 00:12:02.881 "supported_io_types": { 00:12:02.881 "read": true, 00:12:02.881 "write": true, 00:12:02.881 "unmap": true, 00:12:02.881 "flush": true, 00:12:02.881 "reset": true, 00:12:02.881 "nvme_admin": false, 00:12:02.881 "nvme_io": false, 00:12:02.881 "nvme_io_md": false, 00:12:02.881 "write_zeroes": true, 00:12:02.881 "zcopy": true, 00:12:02.881 "get_zone_info": false, 00:12:02.881 "zone_management": false, 00:12:02.881 "zone_append": false, 00:12:02.881 "compare": false, 00:12:02.881 "compare_and_write": false, 00:12:02.881 "abort": true, 00:12:02.881 "seek_hole": false, 00:12:02.881 "seek_data": false, 00:12:02.881 "copy": true, 00:12:02.881 "nvme_iov_md": false 00:12:02.881 }, 00:12:02.881 "memory_domains": [ 00:12:02.881 { 00:12:02.881 "dma_device_id": "system", 00:12:02.881 
"dma_device_type": 1 00:12:02.881 }, 00:12:02.881 { 00:12:02.881 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.881 "dma_device_type": 2 00:12:02.881 } 00:12:02.881 ], 00:12:02.881 "driver_specific": { 00:12:02.881 "passthru": { 00:12:02.881 "name": "pt1", 00:12:02.881 "base_bdev_name": "malloc1" 00:12:02.881 } 00:12:02.881 } 00:12:02.881 }' 00:12:02.881 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.140 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.399 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.399 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.399 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.399 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:03.399 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.399 10:19:40 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.399 "name": "pt2", 00:12:03.399 "aliases": [ 00:12:03.399 "00000000-0000-0000-0000-000000000002" 00:12:03.399 ], 00:12:03.399 "product_name": "passthru", 00:12:03.399 "block_size": 512, 00:12:03.399 "num_blocks": 65536, 00:12:03.399 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:03.399 "assigned_rate_limits": { 00:12:03.399 "rw_ios_per_sec": 0, 00:12:03.399 "rw_mbytes_per_sec": 0, 00:12:03.399 "r_mbytes_per_sec": 0, 00:12:03.399 "w_mbytes_per_sec": 0 00:12:03.399 }, 00:12:03.399 "claimed": true, 00:12:03.399 "claim_type": "exclusive_write", 00:12:03.399 "zoned": false, 00:12:03.399 "supported_io_types": { 00:12:03.399 "read": true, 00:12:03.399 "write": true, 00:12:03.399 "unmap": true, 00:12:03.399 "flush": true, 00:12:03.399 "reset": true, 00:12:03.399 "nvme_admin": false, 00:12:03.399 "nvme_io": false, 00:12:03.399 "nvme_io_md": false, 00:12:03.399 "write_zeroes": true, 00:12:03.399 "zcopy": true, 00:12:03.399 "get_zone_info": false, 00:12:03.399 "zone_management": false, 00:12:03.399 "zone_append": false, 00:12:03.399 "compare": false, 00:12:03.399 "compare_and_write": false, 00:12:03.399 "abort": true, 00:12:03.399 "seek_hole": false, 00:12:03.399 "seek_data": false, 00:12:03.399 "copy": true, 00:12:03.399 "nvme_iov_md": false 00:12:03.399 }, 00:12:03.399 "memory_domains": [ 00:12:03.399 { 00:12:03.399 "dma_device_id": "system", 00:12:03.399 "dma_device_type": 1 00:12:03.399 }, 00:12:03.399 { 00:12:03.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.399 "dma_device_type": 2 00:12:03.399 } 00:12:03.399 ], 00:12:03.399 "driver_specific": { 00:12:03.399 "passthru": { 00:12:03.399 "name": "pt2", 00:12:03.399 "base_bdev_name": "malloc2" 00:12:03.399 } 00:12:03.399 } 00:12:03.399 }' 00:12:03.399 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.658 10:19:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.658 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.917 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.917 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.917 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:03.917 10:19:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:04.176 [2024-07-15 10:19:41.120571] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:04.176 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a8e26685-59e8-436a-8a2d-0b2218a6e3a0 00:12:04.176 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a8e26685-59e8-436a-8a2d-0b2218a6e3a0 ']' 00:12:04.176 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:04.176 [2024-07-15 10:19:41.368993] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:04.176 
[2024-07-15 10:19:41.369015] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:04.176 [2024-07-15 10:19:41.369071] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:04.176 [2024-07-15 10:19:41.369118] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:04.176 [2024-07-15 10:19:41.369138] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e30270 name raid_bdev1, state offline 00:12:04.435 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.435 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:04.695 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:04.695 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:04.695 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:04.695 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:04.695 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:04.695 10:19:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:04.954 10:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:04.954 10:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:05.213 10:19:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:05.213 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:05.472 [2024-07-15 10:19:42.600200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:05.472 [2024-07-15 10:19:42.601550] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:05.472 [2024-07-15 10:19:42.601607] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:05.472 [2024-07-15 10:19:42.601648] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:05.472 [2024-07-15 10:19:42.601667] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:05.472 [2024-07-15 10:19:42.601682] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e2fff0 name raid_bdev1, state configuring 00:12:05.472 request: 00:12:05.472 { 00:12:05.472 "name": "raid_bdev1", 00:12:05.472 "raid_level": "raid0", 00:12:05.472 "base_bdevs": [ 00:12:05.472 "malloc1", 00:12:05.472 "malloc2" 00:12:05.472 ], 00:12:05.472 "strip_size_kb": 64, 00:12:05.472 "superblock": false, 00:12:05.472 "method": "bdev_raid_create", 00:12:05.472 "req_id": 1 00:12:05.472 } 00:12:05.472 Got JSON-RPC error response 00:12:05.472 response: 00:12:05.472 { 00:12:05.472 "code": -17, 00:12:05.472 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:05.472 } 00:12:05.472 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:05.472 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:05.472 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:05.472 10:19:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:05.472 10:19:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.472 10:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:05.731 10:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:05.731 10:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:05.731 10:19:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:05.990 [2024-07-15 10:19:43.081401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:05.990 [2024-07-15 10:19:43.081449] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:05.990 [2024-07-15 10:19:43.081470] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c8c7a0 00:12:05.990 [2024-07-15 10:19:43.081482] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:05.990 [2024-07-15 10:19:43.083120] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:05.990 [2024-07-15 10:19:43.083149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:05.990 [2024-07-15 10:19:43.083218] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:05.990 [2024-07-15 10:19:43.083246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:05.990 pt1 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:05.990 10:19:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:05.990 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.991 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:06.248 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.248 "name": "raid_bdev1", 00:12:06.248 "uuid": "a8e26685-59e8-436a-8a2d-0b2218a6e3a0", 00:12:06.248 "strip_size_kb": 64, 00:12:06.248 "state": "configuring", 00:12:06.248 "raid_level": "raid0", 00:12:06.248 "superblock": true, 00:12:06.249 "num_base_bdevs": 2, 00:12:06.249 "num_base_bdevs_discovered": 1, 00:12:06.249 "num_base_bdevs_operational": 2, 00:12:06.249 "base_bdevs_list": [ 00:12:06.249 { 00:12:06.249 "name": "pt1", 00:12:06.249 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:06.249 "is_configured": true, 00:12:06.249 "data_offset": 2048, 00:12:06.249 "data_size": 63488 00:12:06.249 }, 00:12:06.249 { 00:12:06.249 "name": null, 00:12:06.249 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:06.249 
"is_configured": false, 00:12:06.249 "data_offset": 2048, 00:12:06.249 "data_size": 63488 00:12:06.249 } 00:12:06.249 ] 00:12:06.249 }' 00:12:06.249 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.249 10:19:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:06.814 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:06.814 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:06.814 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:06.814 10:19:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:07.073 [2024-07-15 10:19:44.160261] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:07.073 [2024-07-15 10:19:44.160314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:07.073 [2024-07-15 10:19:44.160332] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e26820 00:12:07.073 [2024-07-15 10:19:44.160344] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:07.073 [2024-07-15 10:19:44.160702] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:07.073 [2024-07-15 10:19:44.160721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:07.073 [2024-07-15 10:19:44.160786] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:07.073 [2024-07-15 10:19:44.160805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:07.073 [2024-07-15 10:19:44.160903] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c82ec0 00:12:07.073 [2024-07-15 
10:19:44.160913] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:07.073 [2024-07-15 10:19:44.161093] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c85530 00:12:07.073 [2024-07-15 10:19:44.161214] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c82ec0 00:12:07.073 [2024-07-15 10:19:44.161224] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c82ec0 00:12:07.073 [2024-07-15 10:19:44.161322] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:07.073 pt2 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.073 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:07.333 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.333 "name": "raid_bdev1", 00:12:07.333 "uuid": "a8e26685-59e8-436a-8a2d-0b2218a6e3a0", 00:12:07.333 "strip_size_kb": 64, 00:12:07.333 "state": "online", 00:12:07.333 "raid_level": "raid0", 00:12:07.333 "superblock": true, 00:12:07.333 "num_base_bdevs": 2, 00:12:07.333 "num_base_bdevs_discovered": 2, 00:12:07.333 "num_base_bdevs_operational": 2, 00:12:07.333 "base_bdevs_list": [ 00:12:07.333 { 00:12:07.333 "name": "pt1", 00:12:07.333 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:07.333 "is_configured": true, 00:12:07.333 "data_offset": 2048, 00:12:07.333 "data_size": 63488 00:12:07.333 }, 00:12:07.333 { 00:12:07.333 "name": "pt2", 00:12:07.333 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:07.333 "is_configured": true, 00:12:07.333 "data_offset": 2048, 00:12:07.333 "data_size": 63488 00:12:07.333 } 00:12:07.333 ] 00:12:07.333 }' 00:12:07.333 10:19:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.333 10:19:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:07.901 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:07.901 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:07.901 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:07.901 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:07.901 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:07.901 10:19:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:07.901 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:07.901 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:08.160 [2024-07-15 10:19:45.251422] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:08.160 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:08.160 "name": "raid_bdev1", 00:12:08.160 "aliases": [ 00:12:08.160 "a8e26685-59e8-436a-8a2d-0b2218a6e3a0" 00:12:08.160 ], 00:12:08.160 "product_name": "Raid Volume", 00:12:08.160 "block_size": 512, 00:12:08.160 "num_blocks": 126976, 00:12:08.160 "uuid": "a8e26685-59e8-436a-8a2d-0b2218a6e3a0", 00:12:08.160 "assigned_rate_limits": { 00:12:08.160 "rw_ios_per_sec": 0, 00:12:08.160 "rw_mbytes_per_sec": 0, 00:12:08.160 "r_mbytes_per_sec": 0, 00:12:08.160 "w_mbytes_per_sec": 0 00:12:08.160 }, 00:12:08.160 "claimed": false, 00:12:08.160 "zoned": false, 00:12:08.160 "supported_io_types": { 00:12:08.160 "read": true, 00:12:08.160 "write": true, 00:12:08.160 "unmap": true, 00:12:08.160 "flush": true, 00:12:08.160 "reset": true, 00:12:08.160 "nvme_admin": false, 00:12:08.160 "nvme_io": false, 00:12:08.160 "nvme_io_md": false, 00:12:08.160 "write_zeroes": true, 00:12:08.160 "zcopy": false, 00:12:08.160 "get_zone_info": false, 00:12:08.160 "zone_management": false, 00:12:08.160 "zone_append": false, 00:12:08.160 "compare": false, 00:12:08.160 "compare_and_write": false, 00:12:08.160 "abort": false, 00:12:08.160 "seek_hole": false, 00:12:08.160 "seek_data": false, 00:12:08.160 "copy": false, 00:12:08.160 "nvme_iov_md": false 00:12:08.160 }, 00:12:08.160 "memory_domains": [ 00:12:08.160 { 00:12:08.160 "dma_device_id": "system", 00:12:08.160 "dma_device_type": 1 00:12:08.160 }, 00:12:08.160 { 
00:12:08.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.160 "dma_device_type": 2 00:12:08.160 }, 00:12:08.160 { 00:12:08.160 "dma_device_id": "system", 00:12:08.160 "dma_device_type": 1 00:12:08.160 }, 00:12:08.160 { 00:12:08.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.160 "dma_device_type": 2 00:12:08.160 } 00:12:08.160 ], 00:12:08.160 "driver_specific": { 00:12:08.160 "raid": { 00:12:08.160 "uuid": "a8e26685-59e8-436a-8a2d-0b2218a6e3a0", 00:12:08.160 "strip_size_kb": 64, 00:12:08.160 "state": "online", 00:12:08.160 "raid_level": "raid0", 00:12:08.160 "superblock": true, 00:12:08.160 "num_base_bdevs": 2, 00:12:08.160 "num_base_bdevs_discovered": 2, 00:12:08.160 "num_base_bdevs_operational": 2, 00:12:08.160 "base_bdevs_list": [ 00:12:08.160 { 00:12:08.160 "name": "pt1", 00:12:08.160 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:08.160 "is_configured": true, 00:12:08.160 "data_offset": 2048, 00:12:08.160 "data_size": 63488 00:12:08.160 }, 00:12:08.160 { 00:12:08.160 "name": "pt2", 00:12:08.160 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:08.160 "is_configured": true, 00:12:08.160 "data_offset": 2048, 00:12:08.160 "data_size": 63488 00:12:08.160 } 00:12:08.160 ] 00:12:08.160 } 00:12:08.160 } 00:12:08.160 }' 00:12:08.160 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:08.160 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:08.160 pt2' 00:12:08.160 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.160 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.160 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:08.419 10:19:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.419 "name": "pt1", 00:12:08.419 "aliases": [ 00:12:08.419 "00000000-0000-0000-0000-000000000001" 00:12:08.419 ], 00:12:08.419 "product_name": "passthru", 00:12:08.419 "block_size": 512, 00:12:08.419 "num_blocks": 65536, 00:12:08.419 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:08.419 "assigned_rate_limits": { 00:12:08.419 "rw_ios_per_sec": 0, 00:12:08.419 "rw_mbytes_per_sec": 0, 00:12:08.419 "r_mbytes_per_sec": 0, 00:12:08.419 "w_mbytes_per_sec": 0 00:12:08.419 }, 00:12:08.419 "claimed": true, 00:12:08.419 "claim_type": "exclusive_write", 00:12:08.419 "zoned": false, 00:12:08.419 "supported_io_types": { 00:12:08.419 "read": true, 00:12:08.419 "write": true, 00:12:08.419 "unmap": true, 00:12:08.419 "flush": true, 00:12:08.419 "reset": true, 00:12:08.419 "nvme_admin": false, 00:12:08.419 "nvme_io": false, 00:12:08.419 "nvme_io_md": false, 00:12:08.419 "write_zeroes": true, 00:12:08.419 "zcopy": true, 00:12:08.419 "get_zone_info": false, 00:12:08.419 "zone_management": false, 00:12:08.419 "zone_append": false, 00:12:08.419 "compare": false, 00:12:08.419 "compare_and_write": false, 00:12:08.419 "abort": true, 00:12:08.419 "seek_hole": false, 00:12:08.419 "seek_data": false, 00:12:08.419 "copy": true, 00:12:08.419 "nvme_iov_md": false 00:12:08.419 }, 00:12:08.419 "memory_domains": [ 00:12:08.419 { 00:12:08.419 "dma_device_id": "system", 00:12:08.419 "dma_device_type": 1 00:12:08.419 }, 00:12:08.419 { 00:12:08.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.419 "dma_device_type": 2 00:12:08.419 } 00:12:08.419 ], 00:12:08.419 "driver_specific": { 00:12:08.419 "passthru": { 00:12:08.419 "name": "pt1", 00:12:08.419 "base_bdev_name": "malloc1" 00:12:08.419 } 00:12:08.419 } 00:12:08.419 }' 00:12:08.419 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.419 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.677 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.936 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:08.936 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.936 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:08.936 10:19:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.936 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.936 "name": "pt2", 00:12:08.936 "aliases": [ 00:12:08.936 "00000000-0000-0000-0000-000000000002" 00:12:08.936 ], 00:12:08.937 "product_name": "passthru", 00:12:08.937 "block_size": 512, 00:12:08.937 "num_blocks": 65536, 00:12:08.937 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:08.937 "assigned_rate_limits": { 00:12:08.937 "rw_ios_per_sec": 0, 00:12:08.937 "rw_mbytes_per_sec": 0, 00:12:08.937 "r_mbytes_per_sec": 0, 00:12:08.937 "w_mbytes_per_sec": 0 00:12:08.937 }, 
00:12:08.937 "claimed": true, 00:12:08.937 "claim_type": "exclusive_write", 00:12:08.937 "zoned": false, 00:12:08.937 "supported_io_types": { 00:12:08.937 "read": true, 00:12:08.937 "write": true, 00:12:08.937 "unmap": true, 00:12:08.937 "flush": true, 00:12:08.937 "reset": true, 00:12:08.937 "nvme_admin": false, 00:12:08.937 "nvme_io": false, 00:12:08.937 "nvme_io_md": false, 00:12:08.937 "write_zeroes": true, 00:12:08.937 "zcopy": true, 00:12:08.937 "get_zone_info": false, 00:12:08.937 "zone_management": false, 00:12:08.937 "zone_append": false, 00:12:08.937 "compare": false, 00:12:08.937 "compare_and_write": false, 00:12:08.937 "abort": true, 00:12:08.937 "seek_hole": false, 00:12:08.937 "seek_data": false, 00:12:08.937 "copy": true, 00:12:08.937 "nvme_iov_md": false 00:12:08.937 }, 00:12:08.937 "memory_domains": [ 00:12:08.937 { 00:12:08.937 "dma_device_id": "system", 00:12:08.937 "dma_device_type": 1 00:12:08.937 }, 00:12:08.937 { 00:12:08.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.937 "dma_device_type": 2 00:12:08.937 } 00:12:08.937 ], 00:12:08.937 "driver_specific": { 00:12:08.937 "passthru": { 00:12:08.937 "name": "pt2", 00:12:08.937 "base_bdev_name": "malloc2" 00:12:08.937 } 00:12:08.937 } 00:12:08.937 }' 00:12:08.937 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.195 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:09.195 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:09.196 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.196 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.196 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:09.196 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.196 10:19:46 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.196 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:09.196 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.454 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.454 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:09.454 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:09.454 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:09.713 [2024-07-15 10:19:46.679404] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a8e26685-59e8-436a-8a2d-0b2218a6e3a0 '!=' a8e26685-59e8-436a-8a2d-0b2218a6e3a0 ']' 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 476707 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 476707 ']' 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 476707 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 476707 00:12:09.713 10:19:46 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 476707' 00:12:09.713 killing process with pid 476707 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 476707 00:12:09.713 [2024-07-15 10:19:46.733745] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:09.713 [2024-07-15 10:19:46.733804] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:09.713 [2024-07-15 10:19:46.733849] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:09.713 [2024-07-15 10:19:46.733861] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c82ec0 name raid_bdev1, state offline 00:12:09.713 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 476707 00:12:09.713 [2024-07-15 10:19:46.753097] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:09.972 10:19:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:09.972 00:12:09.972 real 0m10.480s 00:12:09.972 user 0m18.686s 00:12:09.972 sys 0m1.936s 00:12:09.972 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:09.972 10:19:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.972 ************************************ 00:12:09.972 END TEST raid_superblock_test 00:12:09.972 ************************************ 00:12:09.972 10:19:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:09.972 10:19:47 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:12:09.972 10:19:47 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:09.972 10:19:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:09.972 10:19:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:09.972 ************************************ 00:12:09.972 START TEST raid_read_error_test 00:12:09.972 ************************************ 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.EzApsiiNH2 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=478335 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 478335 /var/tmp/spdk-raid.sock 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 478335 ']' 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:12:09.972 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:09.972 10:19:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.972 [2024-07-15 10:19:47.111228] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:12:09.972 [2024-07-15 10:19:47.111295] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478335 ] 00:12:10.230 [2024-07-15 10:19:47.228544] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:10.230 [2024-07-15 10:19:47.330899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:10.230 [2024-07-15 10:19:47.397879] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:10.230 [2024-07-15 10:19:47.397916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:10.879 10:19:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:10.879 10:19:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:10.879 10:19:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:10.880 10:19:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:11.138 BaseBdev1_malloc 00:12:11.138 10:19:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:11.396 true 00:12:11.396 10:19:48 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:11.654 [2024-07-15 10:19:48.760730] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:11.654 [2024-07-15 10:19:48.760775] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:11.654 [2024-07-15 10:19:48.760797] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27720d0 00:12:11.654 [2024-07-15 10:19:48.760809] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:11.654 [2024-07-15 10:19:48.762667] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:11.654 [2024-07-15 10:19:48.762697] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:11.654 BaseBdev1 00:12:11.654 10:19:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:11.654 10:19:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:11.913 BaseBdev2_malloc 00:12:11.913 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:12.172 true 00:12:12.172 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:12.431 [2024-07-15 10:19:49.496583] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:12.431 [2024-07-15 10:19:49.496628] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:12.431 [2024-07-15 10:19:49.496652] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2776910 00:12:12.431 [2024-07-15 10:19:49.496664] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:12.431 [2024-07-15 10:19:49.498272] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:12.431 [2024-07-15 10:19:49.498299] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:12.431 BaseBdev2 00:12:12.431 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:12.690 [2024-07-15 10:19:49.741258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:12.690 [2024-07-15 10:19:49.742644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:12.690 [2024-07-15 10:19:49.742840] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2778320 00:12:12.690 [2024-07-15 10:19:49.742853] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:12.690 [2024-07-15 10:19:49.743063] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2777270 00:12:12.690 [2024-07-15 10:19:49.743214] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2778320 00:12:12.690 [2024-07-15 10:19:49.743224] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2778320 00:12:12.690 [2024-07-15 10:19:49.743334] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:12.690 10:19:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:12.690 10:19:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.947 10:19:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.947 "name": "raid_bdev1", 00:12:12.947 "uuid": "b53f9fb1-7d09-472c-8c5f-125c65e9bbe7", 00:12:12.947 "strip_size_kb": 64, 00:12:12.947 "state": "online", 00:12:12.947 "raid_level": "raid0", 00:12:12.947 "superblock": true, 00:12:12.947 "num_base_bdevs": 2, 00:12:12.947 "num_base_bdevs_discovered": 2, 00:12:12.947 "num_base_bdevs_operational": 2, 00:12:12.947 "base_bdevs_list": [ 00:12:12.947 { 00:12:12.947 "name": "BaseBdev1", 00:12:12.948 "uuid": "7111c03d-9c5f-594c-913a-bf60ec5c9ab7", 00:12:12.948 "is_configured": true, 00:12:12.948 "data_offset": 2048, 00:12:12.948 "data_size": 63488 00:12:12.948 }, 
00:12:12.948 { 00:12:12.948 "name": "BaseBdev2", 00:12:12.948 "uuid": "03e1b574-63d3-533b-9b45-8babbbe18bb4", 00:12:12.948 "is_configured": true, 00:12:12.948 "data_offset": 2048, 00:12:12.948 "data_size": 63488 00:12:12.948 } 00:12:12.948 ] 00:12:12.948 }' 00:12:12.948 10:19:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.948 10:19:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.514 10:19:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:13.514 10:19:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:13.515 [2024-07-15 10:19:50.684110] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27739b0 00:12:14.447 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.705 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:14.963 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.963 "name": "raid_bdev1", 00:12:14.963 "uuid": "b53f9fb1-7d09-472c-8c5f-125c65e9bbe7", 00:12:14.963 "strip_size_kb": 64, 00:12:14.963 "state": "online", 00:12:14.963 "raid_level": "raid0", 00:12:14.963 "superblock": true, 00:12:14.963 "num_base_bdevs": 2, 00:12:14.963 "num_base_bdevs_discovered": 2, 00:12:14.963 "num_base_bdevs_operational": 2, 00:12:14.963 "base_bdevs_list": [ 00:12:14.963 { 00:12:14.963 "name": "BaseBdev1", 00:12:14.963 "uuid": "7111c03d-9c5f-594c-913a-bf60ec5c9ab7", 00:12:14.963 "is_configured": true, 00:12:14.963 "data_offset": 2048, 00:12:14.963 "data_size": 63488 00:12:14.963 }, 00:12:14.963 { 00:12:14.963 "name": "BaseBdev2", 00:12:14.963 "uuid": "03e1b574-63d3-533b-9b45-8babbbe18bb4", 00:12:14.963 "is_configured": true, 00:12:14.963 "data_offset": 2048, 00:12:14.963 "data_size": 63488 00:12:14.963 } 00:12:14.963 ] 00:12:14.963 }' 00:12:14.963 10:19:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.963 10:19:51 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:15.539 10:19:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:15.539 [2024-07-15 10:19:52.715503] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:15.539 [2024-07-15 10:19:52.715545] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:15.539 [2024-07-15 10:19:52.718709] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:15.539 [2024-07-15 10:19:52.718740] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:15.539 [2024-07-15 10:19:52.718768] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:15.539 [2024-07-15 10:19:52.718780] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2778320 name raid_bdev1, state offline 00:12:15.539 0 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 478335 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 478335 ']' 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 478335 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 478335 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 478335' 00:12:15.797 killing process with pid 478335 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 478335 00:12:15.797 [2024-07-15 10:19:52.785657] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:15.797 10:19:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 478335 00:12:15.797 [2024-07-15 10:19:52.796189] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.EzApsiiNH2 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:12:16.055 00:12:16.055 real 0m5.990s 00:12:16.055 user 0m9.389s 00:12:16.055 sys 0m1.062s 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:16.055 10:19:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.055 ************************************ 00:12:16.055 END TEST raid_read_error_test 00:12:16.055 ************************************ 00:12:16.055 10:19:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:16.055 10:19:53 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:12:16.055 10:19:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:16.055 10:19:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:16.055 10:19:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:16.055 ************************************ 00:12:16.055 START TEST raid_write_error_test 00:12:16.055 ************************************ 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:16.055 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.0KAHXVHI42 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=479198 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 479198 /var/tmp/spdk-raid.sock 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 479198 ']' 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:12:16.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:16.056 10:19:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.056 [2024-07-15 10:19:53.183162] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:12:16.056 [2024-07-15 10:19:53.183230] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479198 ] 00:12:16.315 [2024-07-15 10:19:53.313186] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:16.315 [2024-07-15 10:19:53.421027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:16.315 [2024-07-15 10:19:53.491444] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:16.315 [2024-07-15 10:19:53.491477] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:17.254 10:19:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:17.254 10:19:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:17.254 10:19:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:17.254 10:19:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:17.254 BaseBdev1_malloc 00:12:17.254 10:19:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:17.512 true 00:12:17.512 10:19:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:17.770 [2024-07-15 10:19:54.834501] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:17.770 [2024-07-15 10:19:54.834546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:17.770 [2024-07-15 10:19:54.834566] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d910d0 00:12:17.770 [2024-07-15 10:19:54.834579] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:17.770 [2024-07-15 10:19:54.836442] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:17.770 [2024-07-15 10:19:54.836470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:17.770 BaseBdev1 00:12:17.770 10:19:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:17.770 10:19:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:18.029 BaseBdev2_malloc 00:12:18.029 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:18.288 true 00:12:18.288 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:18.547 [2024-07-15 10:19:55.578267] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:18.547 [2024-07-15 10:19:55.578313] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:18.547 [2024-07-15 10:19:55.578333] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d95910 00:12:18.547 [2024-07-15 10:19:55.578346] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:18.547 [2024-07-15 10:19:55.579968] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:18.547 [2024-07-15 10:19:55.579997] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:18.547 BaseBdev2 00:12:18.547 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:18.806 [2024-07-15 10:19:55.818942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:18.806 [2024-07-15 10:19:55.820318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:18.806 [2024-07-15 10:19:55.820510] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d97320 00:12:18.806 [2024-07-15 10:19:55.820525] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:18.806 [2024-07-15 10:19:55.820721] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d96270 00:12:18.806 [2024-07-15 10:19:55.820869] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d97320 00:12:18.806 [2024-07-15 10:19:55.820879] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d97320 00:12:18.806 [2024-07-15 10:19:55.820999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:18.806 10:19:55 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:18.806 10:19:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.068 10:19:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.068 "name": "raid_bdev1", 00:12:19.068 "uuid": "6b886b81-92c9-4da6-8bdd-04a8a83fc338", 00:12:19.068 "strip_size_kb": 64, 00:12:19.068 "state": "online", 00:12:19.068 "raid_level": "raid0", 00:12:19.068 "superblock": true, 00:12:19.068 "num_base_bdevs": 2, 00:12:19.068 "num_base_bdevs_discovered": 2, 00:12:19.068 "num_base_bdevs_operational": 2, 00:12:19.068 "base_bdevs_list": [ 00:12:19.068 { 00:12:19.068 "name": "BaseBdev1", 00:12:19.068 "uuid": "36110b4a-90ff-5850-9fc7-bb048cd4da25", 00:12:19.068 "is_configured": true, 00:12:19.068 "data_offset": 2048, 00:12:19.068 "data_size": 63488 00:12:19.068 
}, 00:12:19.068 { 00:12:19.068 "name": "BaseBdev2", 00:12:19.068 "uuid": "65f74b8e-b5bc-5026-b805-e04fd281ba89", 00:12:19.068 "is_configured": true, 00:12:19.068 "data_offset": 2048, 00:12:19.068 "data_size": 63488 00:12:19.068 } 00:12:19.068 ] 00:12:19.068 }' 00:12:19.068 10:19:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.068 10:19:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.011 10:19:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:20.011 10:19:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:20.011 [2024-07-15 10:19:57.050458] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d929b0 00:12:20.947 10:19:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.205 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:21.464 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.464 "name": "raid_bdev1", 00:12:21.464 "uuid": "6b886b81-92c9-4da6-8bdd-04a8a83fc338", 00:12:21.464 "strip_size_kb": 64, 00:12:21.464 "state": "online", 00:12:21.464 "raid_level": "raid0", 00:12:21.464 "superblock": true, 00:12:21.464 "num_base_bdevs": 2, 00:12:21.464 "num_base_bdevs_discovered": 2, 00:12:21.464 "num_base_bdevs_operational": 2, 00:12:21.464 "base_bdevs_list": [ 00:12:21.464 { 00:12:21.464 "name": "BaseBdev1", 00:12:21.464 "uuid": "36110b4a-90ff-5850-9fc7-bb048cd4da25", 00:12:21.464 "is_configured": true, 00:12:21.464 "data_offset": 2048, 00:12:21.464 "data_size": 63488 00:12:21.464 }, 00:12:21.464 { 00:12:21.464 "name": "BaseBdev2", 00:12:21.464 "uuid": "65f74b8e-b5bc-5026-b805-e04fd281ba89", 00:12:21.464 "is_configured": true, 00:12:21.464 "data_offset": 2048, 00:12:21.464 "data_size": 63488 00:12:21.464 } 00:12:21.464 ] 00:12:21.464 }' 00:12:21.464 10:19:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.464 10:19:58 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.402 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:22.402 [2024-07-15 10:19:59.537952] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:22.402 [2024-07-15 10:19:59.537994] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:22.402 [2024-07-15 10:19:59.541150] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:22.402 [2024-07-15 10:19:59.541179] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:22.402 [2024-07-15 10:19:59.541208] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:22.402 [2024-07-15 10:19:59.541220] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d97320 name raid_bdev1, state offline 00:12:22.402 0 00:12:22.402 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 479198 00:12:22.402 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 479198 ']' 00:12:22.402 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 479198 00:12:22.402 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:22.402 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:22.402 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 479198 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:22.661 10:19:59 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 479198' 00:12:22.661 killing process with pid 479198 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 479198 00:12:22.661 [2024-07-15 10:19:59.604024] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 479198 00:12:22.661 [2024-07-15 10:19:59.614651] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.0KAHXVHI42 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.40 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.40 != \0\.\0\0 ]] 00:12:22.661 00:12:22.661 real 0m6.741s 00:12:22.661 user 0m10.672s 00:12:22.661 sys 0m1.163s 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:22.661 10:19:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.661 ************************************ 00:12:22.661 END TEST raid_write_error_test 00:12:22.661 ************************************ 00:12:22.920 10:19:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:22.920 10:19:59 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 
concat raid1 00:12:22.920 10:19:59 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:12:22.920 10:19:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:22.920 10:19:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:22.920 10:19:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:22.920 ************************************ 00:12:22.920 START TEST raid_state_function_test 00:12:22.920 ************************************ 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:22.920 10:19:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=480189 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 480189' 00:12:22.920 Process raid pid: 480189 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 480189 /var/tmp/spdk-raid.sock 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 480189 ']' 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:22.920 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:22.920 10:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.920 [2024-07-15 10:20:00.008003] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:12:22.920 [2024-07-15 10:20:00.008075] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:23.179 [2024-07-15 10:20:00.142151] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:23.179 [2024-07-15 10:20:00.238855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:23.179 [2024-07-15 10:20:00.307093] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:23.179 [2024-07-15 10:20:00.307128] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:23.748 10:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:23.748 10:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:23.748 10:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:12:24.006 [2024-07-15 10:20:01.163615] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:24.006 [2024-07-15 10:20:01.163660] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:24.006 [2024-07-15 10:20:01.163671] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:24.006 [2024-07-15 10:20:01.163683] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.006 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:12:24.265 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.265 "name": "Existed_Raid", 00:12:24.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.265 "strip_size_kb": 64, 00:12:24.265 "state": "configuring", 00:12:24.265 "raid_level": "concat", 00:12:24.265 "superblock": false, 00:12:24.265 "num_base_bdevs": 2, 00:12:24.265 "num_base_bdevs_discovered": 0, 00:12:24.265 "num_base_bdevs_operational": 2, 00:12:24.265 "base_bdevs_list": [ 00:12:24.265 { 00:12:24.265 "name": "BaseBdev1", 00:12:24.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.265 "is_configured": false, 00:12:24.265 "data_offset": 0, 00:12:24.265 "data_size": 0 00:12:24.265 }, 00:12:24.265 { 00:12:24.265 "name": "BaseBdev2", 00:12:24.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.265 "is_configured": false, 00:12:24.265 "data_offset": 0, 00:12:24.265 "data_size": 0 00:12:24.265 } 00:12:24.265 ] 00:12:24.265 }' 00:12:24.265 10:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.265 10:20:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.832 10:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:25.091 [2024-07-15 10:20:02.246341] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:25.091 [2024-07-15 10:20:02.246375] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f3aa80 name Existed_Raid, state configuring 00:12:25.091 10:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:25.351 [2024-07-15 10:20:02.495015] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:25.351 [2024-07-15 10:20:02.495045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:25.351 [2024-07-15 10:20:02.495055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:25.351 [2024-07-15 10:20:02.495067] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:25.351 10:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:25.652 [2024-07-15 10:20:02.689293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:25.652 BaseBdev1 00:12:25.652 10:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:25.652 10:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:25.652 10:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:25.652 10:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:25.652 10:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:25.652 10:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:25.652 10:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:25.941 10:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:25.941 [ 00:12:25.941 { 00:12:25.941 "name": 
"BaseBdev1", 00:12:25.941 "aliases": [ 00:12:25.941 "8acfaed2-4c15-4b4a-aa96-f532d274930b" 00:12:25.941 ], 00:12:25.941 "product_name": "Malloc disk", 00:12:25.941 "block_size": 512, 00:12:25.941 "num_blocks": 65536, 00:12:25.941 "uuid": "8acfaed2-4c15-4b4a-aa96-f532d274930b", 00:12:25.941 "assigned_rate_limits": { 00:12:25.941 "rw_ios_per_sec": 0, 00:12:25.941 "rw_mbytes_per_sec": 0, 00:12:25.941 "r_mbytes_per_sec": 0, 00:12:25.941 "w_mbytes_per_sec": 0 00:12:25.941 }, 00:12:25.941 "claimed": true, 00:12:25.941 "claim_type": "exclusive_write", 00:12:25.941 "zoned": false, 00:12:25.941 "supported_io_types": { 00:12:25.941 "read": true, 00:12:25.941 "write": true, 00:12:25.941 "unmap": true, 00:12:25.941 "flush": true, 00:12:25.941 "reset": true, 00:12:25.941 "nvme_admin": false, 00:12:25.941 "nvme_io": false, 00:12:25.941 "nvme_io_md": false, 00:12:25.941 "write_zeroes": true, 00:12:25.941 "zcopy": true, 00:12:25.941 "get_zone_info": false, 00:12:25.941 "zone_management": false, 00:12:25.941 "zone_append": false, 00:12:25.941 "compare": false, 00:12:25.941 "compare_and_write": false, 00:12:25.941 "abort": true, 00:12:25.941 "seek_hole": false, 00:12:25.941 "seek_data": false, 00:12:25.941 "copy": true, 00:12:25.941 "nvme_iov_md": false 00:12:25.941 }, 00:12:25.941 "memory_domains": [ 00:12:25.941 { 00:12:25.941 "dma_device_id": "system", 00:12:25.941 "dma_device_type": 1 00:12:25.941 }, 00:12:25.941 { 00:12:25.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:25.941 "dma_device_type": 2 00:12:25.941 } 00:12:25.941 ], 00:12:25.941 "driver_specific": {} 00:12:25.941 } 00:12:25.941 ] 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.941 
10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.941 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.202 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.202 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.202 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.202 "name": "Existed_Raid", 00:12:26.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.202 "strip_size_kb": 64, 00:12:26.202 "state": "configuring", 00:12:26.202 "raid_level": "concat", 00:12:26.202 "superblock": false, 00:12:26.202 "num_base_bdevs": 2, 00:12:26.202 "num_base_bdevs_discovered": 1, 00:12:26.202 "num_base_bdevs_operational": 2, 00:12:26.202 "base_bdevs_list": [ 00:12:26.202 { 00:12:26.202 "name": "BaseBdev1", 00:12:26.202 "uuid": "8acfaed2-4c15-4b4a-aa96-f532d274930b", 00:12:26.202 "is_configured": true, 00:12:26.202 "data_offset": 0, 00:12:26.202 "data_size": 65536 00:12:26.202 }, 00:12:26.202 { 00:12:26.202 "name": "BaseBdev2", 
00:12:26.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.202 "is_configured": false, 00:12:26.202 "data_offset": 0, 00:12:26.202 "data_size": 0 00:12:26.202 } 00:12:26.202 ] 00:12:26.202 }' 00:12:26.202 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.202 10:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.768 10:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:27.027 [2024-07-15 10:20:04.113063] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:27.027 [2024-07-15 10:20:04.113109] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f3a350 name Existed_Raid, state configuring 00:12:27.027 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:27.286 [2024-07-15 10:20:04.357737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:27.286 [2024-07-15 10:20:04.359228] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:27.286 [2024-07-15 10:20:04.359263] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:27.286 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.545 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:27.545 "name": "Existed_Raid", 00:12:27.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.545 "strip_size_kb": 64, 00:12:27.545 "state": "configuring", 00:12:27.545 "raid_level": "concat", 00:12:27.545 "superblock": false, 00:12:27.545 "num_base_bdevs": 2, 00:12:27.545 "num_base_bdevs_discovered": 1, 00:12:27.545 "num_base_bdevs_operational": 2, 00:12:27.545 "base_bdevs_list": [ 00:12:27.545 { 00:12:27.545 "name": "BaseBdev1", 00:12:27.545 "uuid": "8acfaed2-4c15-4b4a-aa96-f532d274930b", 00:12:27.545 "is_configured": true, 00:12:27.545 "data_offset": 0, 00:12:27.546 "data_size": 65536 00:12:27.546 }, 00:12:27.546 { 00:12:27.546 "name": 
"BaseBdev2", 00:12:27.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:27.546 "is_configured": false, 00:12:27.546 "data_offset": 0, 00:12:27.546 "data_size": 0 00:12:27.546 } 00:12:27.546 ] 00:12:27.546 }' 00:12:27.546 10:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:27.546 10:20:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.114 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:28.373 [2024-07-15 10:20:05.484156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:28.373 [2024-07-15 10:20:05.484196] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f3b000 00:12:28.373 [2024-07-15 10:20:05.484205] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:28.373 [2024-07-15 10:20:05.484403] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e550c0 00:12:28.373 [2024-07-15 10:20:05.484525] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f3b000 00:12:28.373 [2024-07-15 10:20:05.484535] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f3b000 00:12:28.373 [2024-07-15 10:20:05.484705] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:28.373 BaseBdev2 00:12:28.373 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:28.373 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:28.373 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:28.373 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
00:12:28.373 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:28.373 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:28.373 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:28.632 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:28.891 [ 00:12:28.891 { 00:12:28.891 "name": "BaseBdev2", 00:12:28.891 "aliases": [ 00:12:28.891 "854e71ac-3f9c-49a8-8a26-9b2b23d94a64" 00:12:28.891 ], 00:12:28.891 "product_name": "Malloc disk", 00:12:28.891 "block_size": 512, 00:12:28.891 "num_blocks": 65536, 00:12:28.891 "uuid": "854e71ac-3f9c-49a8-8a26-9b2b23d94a64", 00:12:28.891 "assigned_rate_limits": { 00:12:28.891 "rw_ios_per_sec": 0, 00:12:28.891 "rw_mbytes_per_sec": 0, 00:12:28.891 "r_mbytes_per_sec": 0, 00:12:28.891 "w_mbytes_per_sec": 0 00:12:28.891 }, 00:12:28.891 "claimed": true, 00:12:28.891 "claim_type": "exclusive_write", 00:12:28.891 "zoned": false, 00:12:28.891 "supported_io_types": { 00:12:28.891 "read": true, 00:12:28.891 "write": true, 00:12:28.891 "unmap": true, 00:12:28.891 "flush": true, 00:12:28.891 "reset": true, 00:12:28.891 "nvme_admin": false, 00:12:28.891 "nvme_io": false, 00:12:28.891 "nvme_io_md": false, 00:12:28.891 "write_zeroes": true, 00:12:28.891 "zcopy": true, 00:12:28.891 "get_zone_info": false, 00:12:28.891 "zone_management": false, 00:12:28.891 "zone_append": false, 00:12:28.891 "compare": false, 00:12:28.891 "compare_and_write": false, 00:12:28.891 "abort": true, 00:12:28.891 "seek_hole": false, 00:12:28.891 "seek_data": false, 00:12:28.891 "copy": true, 00:12:28.891 "nvme_iov_md": false 00:12:28.891 }, 00:12:28.891 
"memory_domains": [ 00:12:28.891 { 00:12:28.891 "dma_device_id": "system", 00:12:28.891 "dma_device_type": 1 00:12:28.891 }, 00:12:28.891 { 00:12:28.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:28.891 "dma_device_type": 2 00:12:28.891 } 00:12:28.891 ], 00:12:28.891 "driver_specific": {} 00:12:28.891 } 00:12:28.891 ] 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:28.891 10:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.150 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.150 "name": "Existed_Raid", 00:12:29.150 "uuid": "ec31045d-0a8c-4758-97ab-fd0ed6296a20", 00:12:29.150 "strip_size_kb": 64, 00:12:29.150 "state": "online", 00:12:29.150 "raid_level": "concat", 00:12:29.150 "superblock": false, 00:12:29.150 "num_base_bdevs": 2, 00:12:29.150 "num_base_bdevs_discovered": 2, 00:12:29.150 "num_base_bdevs_operational": 2, 00:12:29.150 "base_bdevs_list": [ 00:12:29.150 { 00:12:29.150 "name": "BaseBdev1", 00:12:29.150 "uuid": "8acfaed2-4c15-4b4a-aa96-f532d274930b", 00:12:29.150 "is_configured": true, 00:12:29.150 "data_offset": 0, 00:12:29.150 "data_size": 65536 00:12:29.150 }, 00:12:29.150 { 00:12:29.150 "name": "BaseBdev2", 00:12:29.150 "uuid": "854e71ac-3f9c-49a8-8a26-9b2b23d94a64", 00:12:29.150 "is_configured": true, 00:12:29.150 "data_offset": 0, 00:12:29.150 "data_size": 65536 00:12:29.150 } 00:12:29.150 ] 00:12:29.150 }' 00:12:29.150 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.150 10:20:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.718 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:29.718 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:29.718 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:29.718 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:29.719 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:29.719 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:29.719 
10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:29.719 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:29.978 [2024-07-15 10:20:06.944301] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:29.978 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:29.978 "name": "Existed_Raid", 00:12:29.978 "aliases": [ 00:12:29.978 "ec31045d-0a8c-4758-97ab-fd0ed6296a20" 00:12:29.978 ], 00:12:29.978 "product_name": "Raid Volume", 00:12:29.978 "block_size": 512, 00:12:29.978 "num_blocks": 131072, 00:12:29.978 "uuid": "ec31045d-0a8c-4758-97ab-fd0ed6296a20", 00:12:29.978 "assigned_rate_limits": { 00:12:29.978 "rw_ios_per_sec": 0, 00:12:29.978 "rw_mbytes_per_sec": 0, 00:12:29.978 "r_mbytes_per_sec": 0, 00:12:29.978 "w_mbytes_per_sec": 0 00:12:29.978 }, 00:12:29.978 "claimed": false, 00:12:29.978 "zoned": false, 00:12:29.978 "supported_io_types": { 00:12:29.978 "read": true, 00:12:29.978 "write": true, 00:12:29.978 "unmap": true, 00:12:29.978 "flush": true, 00:12:29.978 "reset": true, 00:12:29.978 "nvme_admin": false, 00:12:29.978 "nvme_io": false, 00:12:29.978 "nvme_io_md": false, 00:12:29.978 "write_zeroes": true, 00:12:29.978 "zcopy": false, 00:12:29.978 "get_zone_info": false, 00:12:29.978 "zone_management": false, 00:12:29.978 "zone_append": false, 00:12:29.978 "compare": false, 00:12:29.978 "compare_and_write": false, 00:12:29.978 "abort": false, 00:12:29.978 "seek_hole": false, 00:12:29.978 "seek_data": false, 00:12:29.978 "copy": false, 00:12:29.978 "nvme_iov_md": false 00:12:29.978 }, 00:12:29.978 "memory_domains": [ 00:12:29.978 { 00:12:29.978 "dma_device_id": "system", 00:12:29.978 "dma_device_type": 1 00:12:29.978 }, 00:12:29.978 { 00:12:29.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.978 
"dma_device_type": 2 00:12:29.978 }, 00:12:29.978 { 00:12:29.978 "dma_device_id": "system", 00:12:29.978 "dma_device_type": 1 00:12:29.978 }, 00:12:29.978 { 00:12:29.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.978 "dma_device_type": 2 00:12:29.978 } 00:12:29.978 ], 00:12:29.978 "driver_specific": { 00:12:29.978 "raid": { 00:12:29.978 "uuid": "ec31045d-0a8c-4758-97ab-fd0ed6296a20", 00:12:29.978 "strip_size_kb": 64, 00:12:29.978 "state": "online", 00:12:29.978 "raid_level": "concat", 00:12:29.978 "superblock": false, 00:12:29.978 "num_base_bdevs": 2, 00:12:29.978 "num_base_bdevs_discovered": 2, 00:12:29.978 "num_base_bdevs_operational": 2, 00:12:29.978 "base_bdevs_list": [ 00:12:29.978 { 00:12:29.978 "name": "BaseBdev1", 00:12:29.978 "uuid": "8acfaed2-4c15-4b4a-aa96-f532d274930b", 00:12:29.978 "is_configured": true, 00:12:29.978 "data_offset": 0, 00:12:29.978 "data_size": 65536 00:12:29.978 }, 00:12:29.978 { 00:12:29.978 "name": "BaseBdev2", 00:12:29.978 "uuid": "854e71ac-3f9c-49a8-8a26-9b2b23d94a64", 00:12:29.978 "is_configured": true, 00:12:29.978 "data_offset": 0, 00:12:29.978 "data_size": 65536 00:12:29.978 } 00:12:29.978 ] 00:12:29.978 } 00:12:29.978 } 00:12:29.978 }' 00:12:29.978 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:29.978 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:29.978 BaseBdev2' 00:12:29.978 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:29.978 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:29.978 10:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:30.238 "name": "BaseBdev1", 00:12:30.238 "aliases": [ 00:12:30.238 "8acfaed2-4c15-4b4a-aa96-f532d274930b" 00:12:30.238 ], 00:12:30.238 "product_name": "Malloc disk", 00:12:30.238 "block_size": 512, 00:12:30.238 "num_blocks": 65536, 00:12:30.238 "uuid": "8acfaed2-4c15-4b4a-aa96-f532d274930b", 00:12:30.238 "assigned_rate_limits": { 00:12:30.238 "rw_ios_per_sec": 0, 00:12:30.238 "rw_mbytes_per_sec": 0, 00:12:30.238 "r_mbytes_per_sec": 0, 00:12:30.238 "w_mbytes_per_sec": 0 00:12:30.238 }, 00:12:30.238 "claimed": true, 00:12:30.238 "claim_type": "exclusive_write", 00:12:30.238 "zoned": false, 00:12:30.238 "supported_io_types": { 00:12:30.238 "read": true, 00:12:30.238 "write": true, 00:12:30.238 "unmap": true, 00:12:30.238 "flush": true, 00:12:30.238 "reset": true, 00:12:30.238 "nvme_admin": false, 00:12:30.238 "nvme_io": false, 00:12:30.238 "nvme_io_md": false, 00:12:30.238 "write_zeroes": true, 00:12:30.238 "zcopy": true, 00:12:30.238 "get_zone_info": false, 00:12:30.238 "zone_management": false, 00:12:30.238 "zone_append": false, 00:12:30.238 "compare": false, 00:12:30.238 "compare_and_write": false, 00:12:30.238 "abort": true, 00:12:30.238 "seek_hole": false, 00:12:30.238 "seek_data": false, 00:12:30.238 "copy": true, 00:12:30.238 "nvme_iov_md": false 00:12:30.238 }, 00:12:30.238 "memory_domains": [ 00:12:30.238 { 00:12:30.238 "dma_device_id": "system", 00:12:30.238 "dma_device_type": 1 00:12:30.238 }, 00:12:30.238 { 00:12:30.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.238 "dma_device_type": 2 00:12:30.238 } 00:12:30.238 ], 00:12:30.238 "driver_specific": {} 00:12:30.238 }' 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:30.238 10:20:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:30.238 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:30.497 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:30.497 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:30.497 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:30.497 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:30.497 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:30.497 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:30.497 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:30.757 "name": "BaseBdev2", 00:12:30.757 "aliases": [ 00:12:30.757 "854e71ac-3f9c-49a8-8a26-9b2b23d94a64" 00:12:30.757 ], 00:12:30.757 "product_name": "Malloc disk", 00:12:30.757 "block_size": 512, 00:12:30.757 "num_blocks": 65536, 00:12:30.757 "uuid": "854e71ac-3f9c-49a8-8a26-9b2b23d94a64", 00:12:30.757 "assigned_rate_limits": { 00:12:30.757 "rw_ios_per_sec": 0, 00:12:30.757 "rw_mbytes_per_sec": 0, 00:12:30.757 "r_mbytes_per_sec": 0, 00:12:30.757 "w_mbytes_per_sec": 0 00:12:30.757 }, 00:12:30.757 "claimed": true, 00:12:30.757 "claim_type": "exclusive_write", 
00:12:30.757 "zoned": false, 00:12:30.757 "supported_io_types": { 00:12:30.757 "read": true, 00:12:30.757 "write": true, 00:12:30.757 "unmap": true, 00:12:30.757 "flush": true, 00:12:30.757 "reset": true, 00:12:30.757 "nvme_admin": false, 00:12:30.757 "nvme_io": false, 00:12:30.757 "nvme_io_md": false, 00:12:30.757 "write_zeroes": true, 00:12:30.757 "zcopy": true, 00:12:30.757 "get_zone_info": false, 00:12:30.757 "zone_management": false, 00:12:30.757 "zone_append": false, 00:12:30.757 "compare": false, 00:12:30.757 "compare_and_write": false, 00:12:30.757 "abort": true, 00:12:30.757 "seek_hole": false, 00:12:30.757 "seek_data": false, 00:12:30.757 "copy": true, 00:12:30.757 "nvme_iov_md": false 00:12:30.757 }, 00:12:30.757 "memory_domains": [ 00:12:30.757 { 00:12:30.757 "dma_device_id": "system", 00:12:30.757 "dma_device_type": 1 00:12:30.757 }, 00:12:30.757 { 00:12:30.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.757 "dma_device_type": 2 00:12:30.757 } 00:12:30.757 ], 00:12:30.757 "driver_specific": {} 00:12:30.757 }' 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:30.757 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:31.016 10:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:31.016 10:20:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.016 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:31.016 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:31.016 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:31.275 [2024-07-15 10:20:08.271627] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:31.275 [2024-07-15 10:20:08.271656] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:31.275 [2024-07-15 10:20:08.271699] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:31.275 10:20:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.275 "name": "Existed_Raid", 00:12:31.275 "uuid": "ec31045d-0a8c-4758-97ab-fd0ed6296a20", 00:12:31.275 "strip_size_kb": 64, 00:12:31.275 "state": "offline", 00:12:31.275 "raid_level": "concat", 00:12:31.275 "superblock": false, 00:12:31.275 "num_base_bdevs": 2, 00:12:31.275 "num_base_bdevs_discovered": 1, 00:12:31.275 "num_base_bdevs_operational": 1, 00:12:31.275 "base_bdevs_list": [ 00:12:31.275 { 00:12:31.275 "name": null, 00:12:31.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:31.275 "is_configured": false, 00:12:31.275 "data_offset": 0, 00:12:31.275 "data_size": 65536 00:12:31.275 }, 00:12:31.275 { 00:12:31.275 "name": "BaseBdev2", 00:12:31.275 "uuid": "854e71ac-3f9c-49a8-8a26-9b2b23d94a64", 00:12:31.275 "is_configured": true, 00:12:31.275 "data_offset": 0, 00:12:31.275 "data_size": 65536 00:12:31.275 } 00:12:31.275 ] 00:12:31.275 }' 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.275 10:20:08 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:31.843 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:31.843 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:31.843 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.843 10:20:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:32.101 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:32.101 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:32.101 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:32.359 [2024-07-15 10:20:09.447775] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:32.359 [2024-07-15 10:20:09.447831] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f3b000 name Existed_Raid, state offline 00:12:32.359 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:32.359 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:32.359 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.360 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 480189 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 480189 ']' 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 480189 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 480189 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 480189' 00:12:32.618 killing process with pid 480189 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 480189 00:12:32.618 [2024-07-15 10:20:09.789152] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:32.618 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 480189 00:12:32.618 [2024-07-15 10:20:09.790022] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:32.876 10:20:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:32.876 00:12:32.876 real 0m10.061s 00:12:32.876 user 0m17.816s 00:12:32.876 sys 0m1.932s 00:12:32.876 10:20:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:32.876 10:20:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:12:32.876 ************************************ 00:12:32.876 END TEST raid_state_function_test 00:12:32.876 ************************************ 00:12:32.876 10:20:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:32.876 10:20:10 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:32.876 10:20:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:32.876 10:20:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:32.876 10:20:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:32.876 ************************************ 00:12:32.876 START TEST raid_state_function_test_sb 00:12:32.876 ************************************ 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:32.876 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=481749 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 481749' 00:12:33.135 Process raid pid: 481749 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 481749 /var/tmp/spdk-raid.sock 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 481749 ']' 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:33.135 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:33.135 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:33.135 [2024-07-15 10:20:10.126968] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:12:33.135 [2024-07-15 10:20:10.127032] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:33.135 [2024-07-15 10:20:10.254801] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:33.394 [2024-07-15 10:20:10.361753] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:33.394 [2024-07-15 10:20:10.421460] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:33.394 [2024-07-15 10:20:10.421490] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:33.960 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:33.960 10:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:33.960 10:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:33.960 [2024-07-15 10:20:11.127369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:33.960 [2024-07-15 10:20:11.127412] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:33.960 [2024-07-15 10:20:11.127423] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:33.960 [2024-07-15 10:20:11.127435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:33.960 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:33.960 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:12:33.960 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:33.960 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.961 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.541 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.541 "name": "Existed_Raid", 00:12:34.541 "uuid": "a91ddb0f-d167-41f6-b395-fb9a877e40cf", 00:12:34.541 "strip_size_kb": 64, 00:12:34.541 "state": "configuring", 00:12:34.541 "raid_level": "concat", 00:12:34.541 "superblock": true, 00:12:34.541 "num_base_bdevs": 2, 00:12:34.541 "num_base_bdevs_discovered": 0, 00:12:34.541 "num_base_bdevs_operational": 2, 00:12:34.541 "base_bdevs_list": [ 00:12:34.541 { 00:12:34.541 "name": "BaseBdev1", 00:12:34.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.541 "is_configured": false, 00:12:34.541 "data_offset": 0, 00:12:34.541 "data_size": 0 00:12:34.541 }, 00:12:34.541 { 
00:12:34.541 "name": "BaseBdev2", 00:12:34.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.541 "is_configured": false, 00:12:34.541 "data_offset": 0, 00:12:34.541 "data_size": 0 00:12:34.541 } 00:12:34.541 ] 00:12:34.541 }' 00:12:34.541 10:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.541 10:20:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:35.105 10:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:35.362 [2024-07-15 10:20:12.450688] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:35.362 [2024-07-15 10:20:12.450720] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb18a80 name Existed_Raid, state configuring 00:12:35.362 10:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:35.619 [2024-07-15 10:20:12.699377] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:35.619 [2024-07-15 10:20:12.699405] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:35.619 [2024-07-15 10:20:12.699415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:35.619 [2024-07-15 10:20:12.699426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:35.619 10:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:35.877 [2024-07-15 10:20:12.949924] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:35.877 BaseBdev1 00:12:35.877 10:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:35.877 10:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:35.877 10:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:35.877 10:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:35.877 10:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:35.877 10:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:35.877 10:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:36.135 10:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:36.392 [ 00:12:36.392 { 00:12:36.392 "name": "BaseBdev1", 00:12:36.392 "aliases": [ 00:12:36.392 "7e992478-7d21-4afb-b9cd-f8f456cf7f8e" 00:12:36.392 ], 00:12:36.392 "product_name": "Malloc disk", 00:12:36.392 "block_size": 512, 00:12:36.392 "num_blocks": 65536, 00:12:36.392 "uuid": "7e992478-7d21-4afb-b9cd-f8f456cf7f8e", 00:12:36.392 "assigned_rate_limits": { 00:12:36.392 "rw_ios_per_sec": 0, 00:12:36.392 "rw_mbytes_per_sec": 0, 00:12:36.392 "r_mbytes_per_sec": 0, 00:12:36.392 "w_mbytes_per_sec": 0 00:12:36.392 }, 00:12:36.392 "claimed": true, 00:12:36.392 "claim_type": "exclusive_write", 00:12:36.392 "zoned": false, 00:12:36.392 "supported_io_types": { 00:12:36.392 "read": true, 00:12:36.392 "write": true, 00:12:36.392 "unmap": true, 00:12:36.392 "flush": 
true, 00:12:36.392 "reset": true, 00:12:36.392 "nvme_admin": false, 00:12:36.392 "nvme_io": false, 00:12:36.392 "nvme_io_md": false, 00:12:36.392 "write_zeroes": true, 00:12:36.392 "zcopy": true, 00:12:36.392 "get_zone_info": false, 00:12:36.392 "zone_management": false, 00:12:36.392 "zone_append": false, 00:12:36.392 "compare": false, 00:12:36.392 "compare_and_write": false, 00:12:36.392 "abort": true, 00:12:36.392 "seek_hole": false, 00:12:36.392 "seek_data": false, 00:12:36.392 "copy": true, 00:12:36.392 "nvme_iov_md": false 00:12:36.392 }, 00:12:36.392 "memory_domains": [ 00:12:36.392 { 00:12:36.392 "dma_device_id": "system", 00:12:36.392 "dma_device_type": 1 00:12:36.392 }, 00:12:36.392 { 00:12:36.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.392 "dma_device_type": 2 00:12:36.392 } 00:12:36.392 ], 00:12:36.392 "driver_specific": {} 00:12:36.392 } 00:12:36.392 ] 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.392 10:20:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.392 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.649 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.649 "name": "Existed_Raid", 00:12:36.649 "uuid": "6799e808-cd54-47d8-be3b-0ce39439e00b", 00:12:36.649 "strip_size_kb": 64, 00:12:36.649 "state": "configuring", 00:12:36.649 "raid_level": "concat", 00:12:36.649 "superblock": true, 00:12:36.649 "num_base_bdevs": 2, 00:12:36.649 "num_base_bdevs_discovered": 1, 00:12:36.649 "num_base_bdevs_operational": 2, 00:12:36.649 "base_bdevs_list": [ 00:12:36.649 { 00:12:36.649 "name": "BaseBdev1", 00:12:36.649 "uuid": "7e992478-7d21-4afb-b9cd-f8f456cf7f8e", 00:12:36.649 "is_configured": true, 00:12:36.649 "data_offset": 2048, 00:12:36.649 "data_size": 63488 00:12:36.649 }, 00:12:36.649 { 00:12:36.649 "name": "BaseBdev2", 00:12:36.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.649 "is_configured": false, 00:12:36.649 "data_offset": 0, 00:12:36.649 "data_size": 0 00:12:36.649 } 00:12:36.649 ] 00:12:36.649 }' 00:12:36.649 10:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.649 10:20:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:37.215 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:37.473 [2024-07-15 10:20:14.481973] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:37.473 [2024-07-15 10:20:14.482014] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb18350 name Existed_Raid, state configuring 00:12:37.473 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:37.731 [2024-07-15 10:20:14.714627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:37.731 [2024-07-15 10:20:14.716171] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:37.731 [2024-07-15 10:20:14.716203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.731 10:20:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.731 10:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.308 10:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.308 "name": "Existed_Raid", 00:12:38.308 "uuid": "9ea25f34-cbed-46f1-85fa-0d21b60893c9", 00:12:38.308 "strip_size_kb": 64, 00:12:38.309 "state": "configuring", 00:12:38.309 "raid_level": "concat", 00:12:38.309 "superblock": true, 00:12:38.309 "num_base_bdevs": 2, 00:12:38.309 "num_base_bdevs_discovered": 1, 00:12:38.309 "num_base_bdevs_operational": 2, 00:12:38.309 "base_bdevs_list": [ 00:12:38.309 { 00:12:38.309 "name": "BaseBdev1", 00:12:38.309 "uuid": "7e992478-7d21-4afb-b9cd-f8f456cf7f8e", 00:12:38.309 "is_configured": true, 00:12:38.309 "data_offset": 2048, 00:12:38.309 "data_size": 63488 00:12:38.309 }, 00:12:38.309 { 00:12:38.309 "name": "BaseBdev2", 00:12:38.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.309 "is_configured": false, 00:12:38.309 "data_offset": 0, 00:12:38.309 "data_size": 0 00:12:38.309 } 00:12:38.309 ] 00:12:38.309 }' 00:12:38.309 10:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.309 10:20:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:38.876 10:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:38.876 [2024-07-15 10:20:16.069567] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:38.876 [2024-07-15 10:20:16.069717] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb19000 00:12:38.876 [2024-07-15 10:20:16.069731] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:38.876 [2024-07-15 10:20:16.069899] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa330c0 00:12:38.876 [2024-07-15 10:20:16.070025] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb19000 00:12:38.876 [2024-07-15 10:20:16.070035] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb19000 00:12:38.876 [2024-07-15 10:20:16.070125] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:38.876 BaseBdev2 00:12:39.135 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:39.135 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:39.135 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:39.135 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:39.135 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:39.135 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:39.135 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:39.394 [ 00:12:39.394 { 00:12:39.394 "name": "BaseBdev2", 00:12:39.394 "aliases": [ 00:12:39.394 "6b642c92-46d9-4785-95d5-a218b5e1d5f2" 00:12:39.394 ], 00:12:39.394 "product_name": "Malloc disk", 00:12:39.394 "block_size": 512, 00:12:39.394 "num_blocks": 65536, 00:12:39.394 "uuid": "6b642c92-46d9-4785-95d5-a218b5e1d5f2", 00:12:39.394 "assigned_rate_limits": { 00:12:39.394 "rw_ios_per_sec": 0, 00:12:39.394 "rw_mbytes_per_sec": 0, 00:12:39.394 "r_mbytes_per_sec": 0, 00:12:39.394 "w_mbytes_per_sec": 0 00:12:39.394 }, 00:12:39.394 "claimed": true, 00:12:39.394 "claim_type": "exclusive_write", 00:12:39.394 "zoned": false, 00:12:39.394 "supported_io_types": { 00:12:39.394 "read": true, 00:12:39.394 "write": true, 00:12:39.394 "unmap": true, 00:12:39.394 "flush": true, 00:12:39.394 "reset": true, 00:12:39.394 "nvme_admin": false, 00:12:39.394 "nvme_io": false, 00:12:39.394 "nvme_io_md": false, 00:12:39.394 "write_zeroes": true, 00:12:39.394 "zcopy": true, 00:12:39.394 "get_zone_info": false, 00:12:39.394 "zone_management": false, 00:12:39.394 "zone_append": false, 00:12:39.394 "compare": false, 00:12:39.394 "compare_and_write": false, 00:12:39.394 "abort": true, 00:12:39.394 "seek_hole": false, 00:12:39.394 "seek_data": false, 00:12:39.394 "copy": true, 00:12:39.394 "nvme_iov_md": false 00:12:39.394 }, 00:12:39.394 "memory_domains": [ 00:12:39.394 { 00:12:39.394 "dma_device_id": "system", 00:12:39.394 "dma_device_type": 1 00:12:39.394 }, 00:12:39.394 { 00:12:39.394 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.394 "dma_device_type": 2 00:12:39.394 } 00:12:39.394 ], 00:12:39.394 "driver_specific": {} 00:12:39.394 } 00:12:39.394 ] 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.394 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.395 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.395 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.395 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.395 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.654 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.654 "name": "Existed_Raid", 00:12:39.654 "uuid": "9ea25f34-cbed-46f1-85fa-0d21b60893c9", 00:12:39.654 "strip_size_kb": 64, 00:12:39.654 "state": "online", 00:12:39.654 "raid_level": "concat", 00:12:39.654 "superblock": true, 00:12:39.654 
"num_base_bdevs": 2, 00:12:39.654 "num_base_bdevs_discovered": 2, 00:12:39.654 "num_base_bdevs_operational": 2, 00:12:39.654 "base_bdevs_list": [ 00:12:39.654 { 00:12:39.654 "name": "BaseBdev1", 00:12:39.654 "uuid": "7e992478-7d21-4afb-b9cd-f8f456cf7f8e", 00:12:39.654 "is_configured": true, 00:12:39.654 "data_offset": 2048, 00:12:39.654 "data_size": 63488 00:12:39.654 }, 00:12:39.654 { 00:12:39.654 "name": "BaseBdev2", 00:12:39.654 "uuid": "6b642c92-46d9-4785-95d5-a218b5e1d5f2", 00:12:39.654 "is_configured": true, 00:12:39.654 "data_offset": 2048, 00:12:39.654 "data_size": 63488 00:12:39.654 } 00:12:39.654 ] 00:12:39.654 }' 00:12:39.654 10:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.654 10:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:40.256 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:40.515 [2024-07-15 10:20:17.646052] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:40.515 10:20:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:40.515 "name": "Existed_Raid", 00:12:40.515 "aliases": [ 00:12:40.515 "9ea25f34-cbed-46f1-85fa-0d21b60893c9" 00:12:40.515 ], 00:12:40.515 "product_name": "Raid Volume", 00:12:40.515 "block_size": 512, 00:12:40.515 "num_blocks": 126976, 00:12:40.515 "uuid": "9ea25f34-cbed-46f1-85fa-0d21b60893c9", 00:12:40.515 "assigned_rate_limits": { 00:12:40.515 "rw_ios_per_sec": 0, 00:12:40.515 "rw_mbytes_per_sec": 0, 00:12:40.515 "r_mbytes_per_sec": 0, 00:12:40.515 "w_mbytes_per_sec": 0 00:12:40.515 }, 00:12:40.515 "claimed": false, 00:12:40.515 "zoned": false, 00:12:40.515 "supported_io_types": { 00:12:40.515 "read": true, 00:12:40.515 "write": true, 00:12:40.515 "unmap": true, 00:12:40.515 "flush": true, 00:12:40.515 "reset": true, 00:12:40.515 "nvme_admin": false, 00:12:40.515 "nvme_io": false, 00:12:40.515 "nvme_io_md": false, 00:12:40.515 "write_zeroes": true, 00:12:40.515 "zcopy": false, 00:12:40.515 "get_zone_info": false, 00:12:40.515 "zone_management": false, 00:12:40.515 "zone_append": false, 00:12:40.515 "compare": false, 00:12:40.515 "compare_and_write": false, 00:12:40.515 "abort": false, 00:12:40.515 "seek_hole": false, 00:12:40.515 "seek_data": false, 00:12:40.515 "copy": false, 00:12:40.515 "nvme_iov_md": false 00:12:40.515 }, 00:12:40.515 "memory_domains": [ 00:12:40.515 { 00:12:40.515 "dma_device_id": "system", 00:12:40.515 "dma_device_type": 1 00:12:40.515 }, 00:12:40.515 { 00:12:40.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.515 "dma_device_type": 2 00:12:40.515 }, 00:12:40.515 { 00:12:40.515 "dma_device_id": "system", 00:12:40.515 "dma_device_type": 1 00:12:40.515 }, 00:12:40.515 { 00:12:40.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.515 "dma_device_type": 2 00:12:40.515 } 00:12:40.515 ], 00:12:40.515 "driver_specific": { 00:12:40.515 "raid": { 00:12:40.515 "uuid": "9ea25f34-cbed-46f1-85fa-0d21b60893c9", 00:12:40.515 "strip_size_kb": 64, 
00:12:40.515 "state": "online", 00:12:40.515 "raid_level": "concat", 00:12:40.515 "superblock": true, 00:12:40.515 "num_base_bdevs": 2, 00:12:40.515 "num_base_bdevs_discovered": 2, 00:12:40.515 "num_base_bdevs_operational": 2, 00:12:40.515 "base_bdevs_list": [ 00:12:40.515 { 00:12:40.515 "name": "BaseBdev1", 00:12:40.515 "uuid": "7e992478-7d21-4afb-b9cd-f8f456cf7f8e", 00:12:40.515 "is_configured": true, 00:12:40.515 "data_offset": 2048, 00:12:40.515 "data_size": 63488 00:12:40.515 }, 00:12:40.515 { 00:12:40.515 "name": "BaseBdev2", 00:12:40.515 "uuid": "6b642c92-46d9-4785-95d5-a218b5e1d5f2", 00:12:40.515 "is_configured": true, 00:12:40.515 "data_offset": 2048, 00:12:40.516 "data_size": 63488 00:12:40.516 } 00:12:40.516 ] 00:12:40.516 } 00:12:40.516 } 00:12:40.516 }' 00:12:40.516 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:40.516 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:40.516 BaseBdev2' 00:12:40.516 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:40.775 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:40.775 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:40.775 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:40.775 "name": "BaseBdev1", 00:12:40.775 "aliases": [ 00:12:40.775 "7e992478-7d21-4afb-b9cd-f8f456cf7f8e" 00:12:40.775 ], 00:12:40.775 "product_name": "Malloc disk", 00:12:40.775 "block_size": 512, 00:12:40.775 "num_blocks": 65536, 00:12:40.775 "uuid": "7e992478-7d21-4afb-b9cd-f8f456cf7f8e", 00:12:40.775 "assigned_rate_limits": { 00:12:40.775 "rw_ios_per_sec": 0, 
00:12:40.775 "rw_mbytes_per_sec": 0, 00:12:40.775 "r_mbytes_per_sec": 0, 00:12:40.775 "w_mbytes_per_sec": 0 00:12:40.775 }, 00:12:40.775 "claimed": true, 00:12:40.775 "claim_type": "exclusive_write", 00:12:40.775 "zoned": false, 00:12:40.775 "supported_io_types": { 00:12:40.775 "read": true, 00:12:40.775 "write": true, 00:12:40.775 "unmap": true, 00:12:40.775 "flush": true, 00:12:40.775 "reset": true, 00:12:40.775 "nvme_admin": false, 00:12:40.775 "nvme_io": false, 00:12:40.775 "nvme_io_md": false, 00:12:40.775 "write_zeroes": true, 00:12:40.775 "zcopy": true, 00:12:40.775 "get_zone_info": false, 00:12:40.775 "zone_management": false, 00:12:40.775 "zone_append": false, 00:12:40.775 "compare": false, 00:12:40.775 "compare_and_write": false, 00:12:40.775 "abort": true, 00:12:40.775 "seek_hole": false, 00:12:40.775 "seek_data": false, 00:12:40.775 "copy": true, 00:12:40.775 "nvme_iov_md": false 00:12:40.775 }, 00:12:40.775 "memory_domains": [ 00:12:40.775 { 00:12:40.775 "dma_device_id": "system", 00:12:40.775 "dma_device_type": 1 00:12:40.775 }, 00:12:40.775 { 00:12:40.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.775 "dma_device_type": 2 00:12:40.775 } 00:12:40.775 ], 00:12:40.775 "driver_specific": {} 00:12:40.775 }' 00:12:40.775 10:20:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.034 
10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.034 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.292 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.292 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.292 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:41.292 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:41.292 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:41.550 "name": "BaseBdev2", 00:12:41.550 "aliases": [ 00:12:41.550 "6b642c92-46d9-4785-95d5-a218b5e1d5f2" 00:12:41.550 ], 00:12:41.550 "product_name": "Malloc disk", 00:12:41.550 "block_size": 512, 00:12:41.550 "num_blocks": 65536, 00:12:41.550 "uuid": "6b642c92-46d9-4785-95d5-a218b5e1d5f2", 00:12:41.550 "assigned_rate_limits": { 00:12:41.550 "rw_ios_per_sec": 0, 00:12:41.550 "rw_mbytes_per_sec": 0, 00:12:41.550 "r_mbytes_per_sec": 0, 00:12:41.550 "w_mbytes_per_sec": 0 00:12:41.550 }, 00:12:41.550 "claimed": true, 00:12:41.550 "claim_type": "exclusive_write", 00:12:41.550 "zoned": false, 00:12:41.550 "supported_io_types": { 00:12:41.550 "read": true, 00:12:41.550 "write": true, 00:12:41.550 "unmap": true, 00:12:41.550 "flush": true, 00:12:41.550 "reset": true, 00:12:41.550 "nvme_admin": false, 00:12:41.550 "nvme_io": false, 00:12:41.550 "nvme_io_md": false, 00:12:41.550 "write_zeroes": true, 00:12:41.550 "zcopy": true, 
00:12:41.550 "get_zone_info": false, 00:12:41.550 "zone_management": false, 00:12:41.550 "zone_append": false, 00:12:41.550 "compare": false, 00:12:41.550 "compare_and_write": false, 00:12:41.550 "abort": true, 00:12:41.550 "seek_hole": false, 00:12:41.550 "seek_data": false, 00:12:41.550 "copy": true, 00:12:41.550 "nvme_iov_md": false 00:12:41.550 }, 00:12:41.550 "memory_domains": [ 00:12:41.550 { 00:12:41.550 "dma_device_id": "system", 00:12:41.550 "dma_device_type": 1 00:12:41.550 }, 00:12:41.550 { 00:12:41.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.550 "dma_device_type": 2 00:12:41.550 } 00:12:41.550 ], 00:12:41.550 "driver_specific": {} 00:12:41.550 }' 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:41.550 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.810 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.810 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.810 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.810 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.810 10:20:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.810 10:20:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:42.070 [2024-07-15 10:20:19.097819] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:42.070 [2024-07-15 10:20:19.097845] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:42.070 [2024-07-15 10:20:19.097886] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.070 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.330 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.330 "name": "Existed_Raid", 00:12:42.330 "uuid": "9ea25f34-cbed-46f1-85fa-0d21b60893c9", 00:12:42.330 "strip_size_kb": 64, 00:12:42.330 "state": "offline", 00:12:42.330 "raid_level": "concat", 00:12:42.330 "superblock": true, 00:12:42.330 "num_base_bdevs": 2, 00:12:42.330 "num_base_bdevs_discovered": 1, 00:12:42.330 "num_base_bdevs_operational": 1, 00:12:42.330 "base_bdevs_list": [ 00:12:42.330 { 00:12:42.330 "name": null, 00:12:42.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.330 "is_configured": false, 00:12:42.330 "data_offset": 2048, 00:12:42.330 "data_size": 63488 00:12:42.330 }, 00:12:42.330 { 00:12:42.330 "name": "BaseBdev2", 00:12:42.330 "uuid": "6b642c92-46d9-4785-95d5-a218b5e1d5f2", 00:12:42.330 "is_configured": true, 00:12:42.330 "data_offset": 2048, 00:12:42.330 "data_size": 63488 00:12:42.330 } 00:12:42.330 ] 00:12:42.330 }' 00:12:42.330 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.330 10:20:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:42.898 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:42.898 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:12:42.898 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.898 10:20:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:43.157 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:43.157 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:43.157 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:43.416 [2024-07-15 10:20:20.439262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:43.416 [2024-07-15 10:20:20.439314] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb19000 name Existed_Raid, state offline 00:12:43.416 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:43.416 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:43.416 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.416 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
481749 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 481749 ']' 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 481749 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 481749 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 481749' 00:12:43.675 killing process with pid 481749 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 481749 00:12:43.675 [2024-07-15 10:20:20.766280] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:43.675 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 481749 00:12:43.675 [2024-07-15 10:20:20.767235] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:43.935 10:20:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:43.935 00:12:43.935 real 0m10.921s 00:12:43.935 user 0m19.480s 00:12:43.935 sys 0m1.960s 00:12:43.935 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:43.935 10:20:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:43.935 ************************************ 00:12:43.935 END TEST raid_state_function_test_sb 00:12:43.935 
************************************ 00:12:43.935 10:20:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:43.935 10:20:21 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:43.935 10:20:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:43.935 10:20:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:43.935 10:20:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:43.935 ************************************ 00:12:43.935 START TEST raid_superblock_test 00:12:43.935 ************************************ 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 
-- # local raid_bdev_uuid 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=483388 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 483388 /var/tmp/spdk-raid.sock 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 483388 ']' 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:43.935 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:43.935 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.935 [2024-07-15 10:20:21.132418] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:12:43.935 [2024-07-15 10:20:21.132476] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483388 ] 00:12:44.211 [2024-07-15 10:20:21.245906] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:44.211 [2024-07-15 10:20:21.347472] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:44.471 [2024-07-15 10:20:21.422666] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:44.471 [2024-07-15 10:20:21.422704] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:45.039 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:45.039 10:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:45.039 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:45.039 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:45.039 10:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:45.039 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:45.039 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:45.039 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:45.039 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:45.039 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:45.039 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:45.298 malloc1 00:12:45.298 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:45.298 [2024-07-15 10:20:22.473824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:45.298 [2024-07-15 10:20:22.473873] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:45.298 [2024-07-15 10:20:22.473892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13d0570 00:12:45.298 [2024-07-15 10:20:22.473905] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:45.298 [2024-07-15 10:20:22.475467] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:45.298 [2024-07-15 10:20:22.475495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:45.298 pt1 00:12:45.298 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:45.298 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:45.298 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:45.298 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:45.298 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:45.298 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:45.557 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:45.557 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:45.557 10:20:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:45.557 malloc2 00:12:45.557 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:45.817 [2024-07-15 10:20:22.975989] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:45.817 [2024-07-15 10:20:22.976035] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:45.817 [2024-07-15 10:20:22.976051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13d1970 00:12:45.817 [2024-07-15 10:20:22.976064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:45.817 [2024-07-15 10:20:22.977538] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:45.817 [2024-07-15 10:20:22.977565] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:45.817 pt2 00:12:45.817 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:45.817 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:45.817 10:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:46.077 [2024-07-15 10:20:23.216642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:46.077 [2024-07-15 10:20:23.217883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:46.077 [2024-07-15 10:20:23.218035] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1574270 
00:12:46.077 [2024-07-15 10:20:23.218049] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:46.077 [2024-07-15 10:20:23.218239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1569c10 00:12:46.077 [2024-07-15 10:20:23.218384] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1574270 00:12:46.077 [2024-07-15 10:20:23.218394] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1574270 00:12:46.077 [2024-07-15 10:20:23.218494] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:46.077 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.335 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.335 "name": "raid_bdev1", 00:12:46.335 "uuid": "1ae014b2-579a-4777-9102-763d27171d26", 00:12:46.335 "strip_size_kb": 64, 00:12:46.335 "state": "online", 00:12:46.335 "raid_level": "concat", 00:12:46.335 "superblock": true, 00:12:46.335 "num_base_bdevs": 2, 00:12:46.335 "num_base_bdevs_discovered": 2, 00:12:46.335 "num_base_bdevs_operational": 2, 00:12:46.335 "base_bdevs_list": [ 00:12:46.335 { 00:12:46.335 "name": "pt1", 00:12:46.335 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:46.335 "is_configured": true, 00:12:46.335 "data_offset": 2048, 00:12:46.335 "data_size": 63488 00:12:46.335 }, 00:12:46.335 { 00:12:46.335 "name": "pt2", 00:12:46.335 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:46.335 "is_configured": true, 00:12:46.335 "data_offset": 2048, 00:12:46.335 "data_size": 63488 00:12:46.335 } 00:12:46.335 ] 00:12:46.335 }' 00:12:46.335 10:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.335 10:20:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- 
# jq '.[]' 00:12:46.902 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:47.160 [2024-07-15 10:20:24.299959] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:47.160 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:47.160 "name": "raid_bdev1", 00:12:47.160 "aliases": [ 00:12:47.160 "1ae014b2-579a-4777-9102-763d27171d26" 00:12:47.160 ], 00:12:47.160 "product_name": "Raid Volume", 00:12:47.160 "block_size": 512, 00:12:47.160 "num_blocks": 126976, 00:12:47.160 "uuid": "1ae014b2-579a-4777-9102-763d27171d26", 00:12:47.160 "assigned_rate_limits": { 00:12:47.160 "rw_ios_per_sec": 0, 00:12:47.160 "rw_mbytes_per_sec": 0, 00:12:47.160 "r_mbytes_per_sec": 0, 00:12:47.160 "w_mbytes_per_sec": 0 00:12:47.160 }, 00:12:47.160 "claimed": false, 00:12:47.160 "zoned": false, 00:12:47.160 "supported_io_types": { 00:12:47.160 "read": true, 00:12:47.160 "write": true, 00:12:47.160 "unmap": true, 00:12:47.160 "flush": true, 00:12:47.160 "reset": true, 00:12:47.160 "nvme_admin": false, 00:12:47.160 "nvme_io": false, 00:12:47.160 "nvme_io_md": false, 00:12:47.160 "write_zeroes": true, 00:12:47.160 "zcopy": false, 00:12:47.160 "get_zone_info": false, 00:12:47.160 "zone_management": false, 00:12:47.160 "zone_append": false, 00:12:47.160 "compare": false, 00:12:47.160 "compare_and_write": false, 00:12:47.160 "abort": false, 00:12:47.160 "seek_hole": false, 00:12:47.160 "seek_data": false, 00:12:47.160 "copy": false, 00:12:47.160 "nvme_iov_md": false 00:12:47.160 }, 00:12:47.160 "memory_domains": [ 00:12:47.160 { 00:12:47.160 "dma_device_id": "system", 00:12:47.160 "dma_device_type": 1 00:12:47.160 }, 00:12:47.160 { 00:12:47.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.160 "dma_device_type": 2 00:12:47.160 }, 00:12:47.160 { 00:12:47.160 "dma_device_id": 
"system", 00:12:47.160 "dma_device_type": 1 00:12:47.160 }, 00:12:47.160 { 00:12:47.160 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.160 "dma_device_type": 2 00:12:47.160 } 00:12:47.160 ], 00:12:47.160 "driver_specific": { 00:12:47.160 "raid": { 00:12:47.160 "uuid": "1ae014b2-579a-4777-9102-763d27171d26", 00:12:47.160 "strip_size_kb": 64, 00:12:47.160 "state": "online", 00:12:47.160 "raid_level": "concat", 00:12:47.160 "superblock": true, 00:12:47.160 "num_base_bdevs": 2, 00:12:47.160 "num_base_bdevs_discovered": 2, 00:12:47.160 "num_base_bdevs_operational": 2, 00:12:47.161 "base_bdevs_list": [ 00:12:47.161 { 00:12:47.161 "name": "pt1", 00:12:47.161 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:47.161 "is_configured": true, 00:12:47.161 "data_offset": 2048, 00:12:47.161 "data_size": 63488 00:12:47.161 }, 00:12:47.161 { 00:12:47.161 "name": "pt2", 00:12:47.161 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:47.161 "is_configured": true, 00:12:47.161 "data_offset": 2048, 00:12:47.161 "data_size": 63488 00:12:47.161 } 00:12:47.161 ] 00:12:47.161 } 00:12:47.161 } 00:12:47.161 }' 00:12:47.161 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:47.421 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:47.421 pt2' 00:12:47.421 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.421 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:47.421 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.421 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.421 "name": "pt1", 00:12:47.421 "aliases": [ 00:12:47.421 
"00000000-0000-0000-0000-000000000001" 00:12:47.421 ], 00:12:47.421 "product_name": "passthru", 00:12:47.421 "block_size": 512, 00:12:47.421 "num_blocks": 65536, 00:12:47.421 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:47.421 "assigned_rate_limits": { 00:12:47.421 "rw_ios_per_sec": 0, 00:12:47.421 "rw_mbytes_per_sec": 0, 00:12:47.421 "r_mbytes_per_sec": 0, 00:12:47.421 "w_mbytes_per_sec": 0 00:12:47.421 }, 00:12:47.421 "claimed": true, 00:12:47.421 "claim_type": "exclusive_write", 00:12:47.421 "zoned": false, 00:12:47.421 "supported_io_types": { 00:12:47.421 "read": true, 00:12:47.421 "write": true, 00:12:47.421 "unmap": true, 00:12:47.421 "flush": true, 00:12:47.421 "reset": true, 00:12:47.421 "nvme_admin": false, 00:12:47.421 "nvme_io": false, 00:12:47.421 "nvme_io_md": false, 00:12:47.421 "write_zeroes": true, 00:12:47.421 "zcopy": true, 00:12:47.421 "get_zone_info": false, 00:12:47.421 "zone_management": false, 00:12:47.421 "zone_append": false, 00:12:47.421 "compare": false, 00:12:47.421 "compare_and_write": false, 00:12:47.421 "abort": true, 00:12:47.421 "seek_hole": false, 00:12:47.421 "seek_data": false, 00:12:47.421 "copy": true, 00:12:47.421 "nvme_iov_md": false 00:12:47.421 }, 00:12:47.421 "memory_domains": [ 00:12:47.421 { 00:12:47.421 "dma_device_id": "system", 00:12:47.421 "dma_device_type": 1 00:12:47.421 }, 00:12:47.421 { 00:12:47.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.421 "dma_device_type": 2 00:12:47.421 } 00:12:47.421 ], 00:12:47.421 "driver_specific": { 00:12:47.421 "passthru": { 00:12:47.421 "name": "pt1", 00:12:47.421 "base_bdev_name": "malloc1" 00:12:47.421 } 00:12:47.421 } 00:12:47.421 }' 00:12:47.421 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:47.681 10:20:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.681 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.940 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:47.940 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.940 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:47.940 10:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.940 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.940 "name": "pt2", 00:12:47.940 "aliases": [ 00:12:47.940 "00000000-0000-0000-0000-000000000002" 00:12:47.940 ], 00:12:47.940 "product_name": "passthru", 00:12:47.940 "block_size": 512, 00:12:47.940 "num_blocks": 65536, 00:12:47.940 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:47.940 "assigned_rate_limits": { 00:12:47.940 "rw_ios_per_sec": 0, 00:12:47.940 "rw_mbytes_per_sec": 0, 00:12:47.940 "r_mbytes_per_sec": 0, 00:12:47.940 "w_mbytes_per_sec": 0 00:12:47.940 }, 00:12:47.940 "claimed": true, 00:12:47.940 "claim_type": "exclusive_write", 00:12:47.940 "zoned": false, 00:12:47.940 "supported_io_types": { 
00:12:47.940 "read": true, 00:12:47.940 "write": true, 00:12:47.940 "unmap": true, 00:12:47.940 "flush": true, 00:12:47.940 "reset": true, 00:12:47.940 "nvme_admin": false, 00:12:47.940 "nvme_io": false, 00:12:47.940 "nvme_io_md": false, 00:12:47.940 "write_zeroes": true, 00:12:47.940 "zcopy": true, 00:12:47.940 "get_zone_info": false, 00:12:47.940 "zone_management": false, 00:12:47.940 "zone_append": false, 00:12:47.940 "compare": false, 00:12:47.940 "compare_and_write": false, 00:12:47.940 "abort": true, 00:12:47.940 "seek_hole": false, 00:12:47.940 "seek_data": false, 00:12:47.940 "copy": true, 00:12:47.940 "nvme_iov_md": false 00:12:47.940 }, 00:12:47.940 "memory_domains": [ 00:12:47.940 { 00:12:47.940 "dma_device_id": "system", 00:12:47.940 "dma_device_type": 1 00:12:47.940 }, 00:12:47.940 { 00:12:47.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.940 "dma_device_type": 2 00:12:47.940 } 00:12:47.940 ], 00:12:47.940 "driver_specific": { 00:12:47.940 "passthru": { 00:12:47.940 "name": "pt2", 00:12:47.940 "base_bdev_name": "malloc2" 00:12:47.940 } 00:12:47.940 } 00:12:47.940 }' 00:12:47.940 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.940 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:12:48.200 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.458 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:48.458 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:48.458 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:48.458 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:49.026 [2024-07-15 10:20:25.932324] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:49.026 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1ae014b2-579a-4777-9102-763d27171d26 00:12:49.026 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1ae014b2-579a-4777-9102-763d27171d26 ']' 00:12:49.026 10:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:49.026 [2024-07-15 10:20:26.180713] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:49.026 [2024-07-15 10:20:26.180737] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:49.026 [2024-07-15 10:20:26.180793] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:49.026 [2024-07-15 10:20:26.180838] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:49.026 [2024-07-15 10:20:26.180857] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1574270 name raid_bdev1, state offline 00:12:49.026 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.026 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:49.285 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:49.285 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:49.285 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:49.285 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:49.544 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:49.544 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:49.803 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:49.803 10:20:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:50.063 [2024-07-15 10:20:27.231477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:50.063 [2024-07-15 10:20:27.232899] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:50.063 [2024-07-15 10:20:27.232966] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:50.063 [2024-07-15 10:20:27.233008] 
bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:50.063 [2024-07-15 10:20:27.233027] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:50.063 [2024-07-15 10:20:27.233043] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1573ff0 name raid_bdev1, state configuring 00:12:50.063 request: 00:12:50.063 { 00:12:50.063 "name": "raid_bdev1", 00:12:50.063 "raid_level": "concat", 00:12:50.063 "base_bdevs": [ 00:12:50.063 "malloc1", 00:12:50.063 "malloc2" 00:12:50.063 ], 00:12:50.063 "strip_size_kb": 64, 00:12:50.063 "superblock": false, 00:12:50.063 "method": "bdev_raid_create", 00:12:50.063 "req_id": 1 00:12:50.063 } 00:12:50.063 Got JSON-RPC error response 00:12:50.063 response: 00:12:50.063 { 00:12:50.063 "code": -17, 00:12:50.063 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:50.063 } 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.063 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:50.323 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:50.323 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:50.323 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:50.580 [2024-07-15 10:20:27.724722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:50.580 [2024-07-15 10:20:27.724769] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:50.581 [2024-07-15 10:20:27.724792] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13d07a0 00:12:50.581 [2024-07-15 10:20:27.724804] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:50.581 [2024-07-15 10:20:27.726406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:50.581 [2024-07-15 10:20:27.726435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:50.581 [2024-07-15 10:20:27.726511] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:50.581 [2024-07-15 10:20:27.726538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:50.581 pt1 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.581 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:50.838 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:50.838 "name": "raid_bdev1", 00:12:50.839 "uuid": "1ae014b2-579a-4777-9102-763d27171d26", 00:12:50.839 "strip_size_kb": 64, 00:12:50.839 "state": "configuring", 00:12:50.839 "raid_level": "concat", 00:12:50.839 "superblock": true, 00:12:50.839 "num_base_bdevs": 2, 00:12:50.839 "num_base_bdevs_discovered": 1, 00:12:50.839 "num_base_bdevs_operational": 2, 00:12:50.839 "base_bdevs_list": [ 00:12:50.839 { 00:12:50.839 "name": "pt1", 00:12:50.839 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:50.839 "is_configured": true, 00:12:50.839 "data_offset": 2048, 00:12:50.839 "data_size": 63488 00:12:50.839 }, 00:12:50.839 { 00:12:50.839 "name": null, 00:12:50.839 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:50.839 "is_configured": false, 00:12:50.839 "data_offset": 2048, 00:12:50.839 "data_size": 63488 00:12:50.839 } 00:12:50.839 ] 00:12:50.839 }' 00:12:50.839 10:20:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:50.839 10:20:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 
00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:51.776 [2024-07-15 10:20:28.867774] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:51.776 [2024-07-15 10:20:28.867827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.776 [2024-07-15 10:20:28.867845] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x156a820 00:12:51.776 [2024-07-15 10:20:28.867858] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.776 [2024-07-15 10:20:28.868222] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.776 [2024-07-15 10:20:28.868241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:51.776 [2024-07-15 10:20:28.868303] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:51.776 [2024-07-15 10:20:28.868322] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:51.776 [2024-07-15 10:20:28.868420] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13c6ec0 00:12:51.776 [2024-07-15 10:20:28.868430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:51.776 [2024-07-15 10:20:28.868601] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13c7f00 00:12:51.776 [2024-07-15 10:20:28.868729] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13c6ec0 00:12:51.776 [2024-07-15 10:20:28.868739] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13c6ec0 00:12:51.776 [2024-07-15 10:20:28.868843] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:51.776 pt2 
00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:51.776 10:20:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.036 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:52.036 "name": "raid_bdev1", 00:12:52.036 "uuid": "1ae014b2-579a-4777-9102-763d27171d26", 00:12:52.036 "strip_size_kb": 64, 00:12:52.036 "state": "online", 00:12:52.036 "raid_level": "concat", 00:12:52.036 "superblock": true, 00:12:52.036 "num_base_bdevs": 2, 
00:12:52.036 "num_base_bdevs_discovered": 2, 00:12:52.036 "num_base_bdevs_operational": 2, 00:12:52.036 "base_bdevs_list": [ 00:12:52.036 { 00:12:52.036 "name": "pt1", 00:12:52.036 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:52.036 "is_configured": true, 00:12:52.036 "data_offset": 2048, 00:12:52.036 "data_size": 63488 00:12:52.036 }, 00:12:52.036 { 00:12:52.036 "name": "pt2", 00:12:52.036 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:52.036 "is_configured": true, 00:12:52.036 "data_offset": 2048, 00:12:52.036 "data_size": 63488 00:12:52.036 } 00:12:52.036 ] 00:12:52.036 }' 00:12:52.036 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:52.036 10:20:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.603 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:52.603 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:52.603 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:52.603 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:52.604 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:52.604 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:52.604 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:52.604 10:20:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:53.171 [2024-07-15 10:20:30.227681] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:53.171 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:53.171 "name": "raid_bdev1", 
00:12:53.171 "aliases": [ 00:12:53.171 "1ae014b2-579a-4777-9102-763d27171d26" 00:12:53.171 ], 00:12:53.171 "product_name": "Raid Volume", 00:12:53.171 "block_size": 512, 00:12:53.171 "num_blocks": 126976, 00:12:53.171 "uuid": "1ae014b2-579a-4777-9102-763d27171d26", 00:12:53.171 "assigned_rate_limits": { 00:12:53.171 "rw_ios_per_sec": 0, 00:12:53.171 "rw_mbytes_per_sec": 0, 00:12:53.171 "r_mbytes_per_sec": 0, 00:12:53.171 "w_mbytes_per_sec": 0 00:12:53.171 }, 00:12:53.171 "claimed": false, 00:12:53.171 "zoned": false, 00:12:53.171 "supported_io_types": { 00:12:53.171 "read": true, 00:12:53.171 "write": true, 00:12:53.171 "unmap": true, 00:12:53.171 "flush": true, 00:12:53.171 "reset": true, 00:12:53.171 "nvme_admin": false, 00:12:53.171 "nvme_io": false, 00:12:53.171 "nvme_io_md": false, 00:12:53.171 "write_zeroes": true, 00:12:53.171 "zcopy": false, 00:12:53.171 "get_zone_info": false, 00:12:53.171 "zone_management": false, 00:12:53.171 "zone_append": false, 00:12:53.171 "compare": false, 00:12:53.171 "compare_and_write": false, 00:12:53.171 "abort": false, 00:12:53.171 "seek_hole": false, 00:12:53.171 "seek_data": false, 00:12:53.171 "copy": false, 00:12:53.171 "nvme_iov_md": false 00:12:53.171 }, 00:12:53.171 "memory_domains": [ 00:12:53.171 { 00:12:53.171 "dma_device_id": "system", 00:12:53.171 "dma_device_type": 1 00:12:53.171 }, 00:12:53.171 { 00:12:53.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.171 "dma_device_type": 2 00:12:53.171 }, 00:12:53.171 { 00:12:53.171 "dma_device_id": "system", 00:12:53.171 "dma_device_type": 1 00:12:53.171 }, 00:12:53.171 { 00:12:53.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.171 "dma_device_type": 2 00:12:53.171 } 00:12:53.171 ], 00:12:53.171 "driver_specific": { 00:12:53.171 "raid": { 00:12:53.171 "uuid": "1ae014b2-579a-4777-9102-763d27171d26", 00:12:53.171 "strip_size_kb": 64, 00:12:53.171 "state": "online", 00:12:53.172 "raid_level": "concat", 00:12:53.172 "superblock": true, 00:12:53.172 
"num_base_bdevs": 2, 00:12:53.172 "num_base_bdevs_discovered": 2, 00:12:53.172 "num_base_bdevs_operational": 2, 00:12:53.172 "base_bdevs_list": [ 00:12:53.172 { 00:12:53.172 "name": "pt1", 00:12:53.172 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:53.172 "is_configured": true, 00:12:53.172 "data_offset": 2048, 00:12:53.172 "data_size": 63488 00:12:53.172 }, 00:12:53.172 { 00:12:53.172 "name": "pt2", 00:12:53.172 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:53.172 "is_configured": true, 00:12:53.172 "data_offset": 2048, 00:12:53.172 "data_size": 63488 00:12:53.172 } 00:12:53.172 ] 00:12:53.172 } 00:12:53.172 } 00:12:53.172 }' 00:12:53.172 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:53.172 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:53.172 pt2' 00:12:53.172 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:53.172 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:53.172 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:53.740 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:53.740 "name": "pt1", 00:12:53.740 "aliases": [ 00:12:53.740 "00000000-0000-0000-0000-000000000001" 00:12:53.740 ], 00:12:53.740 "product_name": "passthru", 00:12:53.740 "block_size": 512, 00:12:53.740 "num_blocks": 65536, 00:12:53.740 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:53.740 "assigned_rate_limits": { 00:12:53.740 "rw_ios_per_sec": 0, 00:12:53.740 "rw_mbytes_per_sec": 0, 00:12:53.740 "r_mbytes_per_sec": 0, 00:12:53.740 "w_mbytes_per_sec": 0 00:12:53.740 }, 00:12:53.740 "claimed": true, 00:12:53.740 "claim_type": 
"exclusive_write", 00:12:53.740 "zoned": false, 00:12:53.740 "supported_io_types": { 00:12:53.740 "read": true, 00:12:53.740 "write": true, 00:12:53.740 "unmap": true, 00:12:53.740 "flush": true, 00:12:53.740 "reset": true, 00:12:53.740 "nvme_admin": false, 00:12:53.740 "nvme_io": false, 00:12:53.740 "nvme_io_md": false, 00:12:53.740 "write_zeroes": true, 00:12:53.740 "zcopy": true, 00:12:53.740 "get_zone_info": false, 00:12:53.740 "zone_management": false, 00:12:53.740 "zone_append": false, 00:12:53.740 "compare": false, 00:12:53.740 "compare_and_write": false, 00:12:53.740 "abort": true, 00:12:53.740 "seek_hole": false, 00:12:53.740 "seek_data": false, 00:12:53.740 "copy": true, 00:12:53.740 "nvme_iov_md": false 00:12:53.740 }, 00:12:53.740 "memory_domains": [ 00:12:53.740 { 00:12:53.740 "dma_device_id": "system", 00:12:53.740 "dma_device_type": 1 00:12:53.740 }, 00:12:53.740 { 00:12:53.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.740 "dma_device_type": 2 00:12:53.740 } 00:12:53.740 ], 00:12:53.740 "driver_specific": { 00:12:53.740 "passthru": { 00:12:53.740 "name": "pt1", 00:12:53.740 "base_bdev_name": "malloc1" 00:12:53.740 } 00:12:53.740 } 00:12:53.740 }' 00:12:53.740 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.740 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.740 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:53.740 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.999 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.999 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:53.999 10:20:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.999 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.999 
10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.999 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.999 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.999 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.999 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:53.999 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:53.999 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:54.274 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:54.274 "name": "pt2", 00:12:54.274 "aliases": [ 00:12:54.274 "00000000-0000-0000-0000-000000000002" 00:12:54.274 ], 00:12:54.274 "product_name": "passthru", 00:12:54.274 "block_size": 512, 00:12:54.274 "num_blocks": 65536, 00:12:54.274 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:54.274 "assigned_rate_limits": { 00:12:54.274 "rw_ios_per_sec": 0, 00:12:54.274 "rw_mbytes_per_sec": 0, 00:12:54.274 "r_mbytes_per_sec": 0, 00:12:54.274 "w_mbytes_per_sec": 0 00:12:54.274 }, 00:12:54.274 "claimed": true, 00:12:54.274 "claim_type": "exclusive_write", 00:12:54.274 "zoned": false, 00:12:54.274 "supported_io_types": { 00:12:54.274 "read": true, 00:12:54.274 "write": true, 00:12:54.274 "unmap": true, 00:12:54.274 "flush": true, 00:12:54.274 "reset": true, 00:12:54.274 "nvme_admin": false, 00:12:54.274 "nvme_io": false, 00:12:54.274 "nvme_io_md": false, 00:12:54.274 "write_zeroes": true, 00:12:54.274 "zcopy": true, 00:12:54.274 "get_zone_info": false, 00:12:54.274 "zone_management": false, 00:12:54.274 "zone_append": false, 00:12:54.274 "compare": false, 00:12:54.274 "compare_and_write": false, 
00:12:54.274 "abort": true, 00:12:54.274 "seek_hole": false, 00:12:54.274 "seek_data": false, 00:12:54.274 "copy": true, 00:12:54.274 "nvme_iov_md": false 00:12:54.274 }, 00:12:54.274 "memory_domains": [ 00:12:54.274 { 00:12:54.274 "dma_device_id": "system", 00:12:54.274 "dma_device_type": 1 00:12:54.274 }, 00:12:54.274 { 00:12:54.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:54.274 "dma_device_type": 2 00:12:54.274 } 00:12:54.274 ], 00:12:54.274 "driver_specific": { 00:12:54.274 "passthru": { 00:12:54.274 "name": "pt2", 00:12:54.274 "base_bdev_name": "malloc2" 00:12:54.274 } 00:12:54.274 } 00:12:54.274 }' 00:12:54.274 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.274 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:54.274 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:54.274 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
raid_bdev1 00:12:54.555 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:54.814 [2024-07-15 10:20:31.944276] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1ae014b2-579a-4777-9102-763d27171d26 '!=' 1ae014b2-579a-4777-9102-763d27171d26 ']' 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 483388 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 483388 ']' 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 483388 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:54.814 10:20:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 483388 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 483388' 00:12:55.073 killing process with pid 483388 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 483388 00:12:55.073 [2024-07-15 10:20:32.031134] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:55.073 [2024-07-15 
10:20:32.031194] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:55.073 [2024-07-15 10:20:32.031240] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:55.073 [2024-07-15 10:20:32.031253] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13c6ec0 name raid_bdev1, state offline 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 483388 00:12:55.073 [2024-07-15 10:20:32.048988] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:55.073 00:12:55.073 real 0m11.195s 00:12:55.073 user 0m19.961s 00:12:55.073 sys 0m2.038s 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:55.073 10:20:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.073 ************************************ 00:12:55.073 END TEST raid_superblock_test 00:12:55.073 ************************************ 00:12:55.333 10:20:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:55.333 10:20:32 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:55.333 10:20:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:55.333 10:20:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:55.333 10:20:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:55.333 ************************************ 00:12:55.333 START TEST raid_read_error_test 00:12:55.333 ************************************ 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:55.333 10:20:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:55.333 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.INKWhAf0Ce 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=485137 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 485137 /var/tmp/spdk-raid.sock 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 485137 ']' 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:55.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:55.334 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.334 [2024-07-15 10:20:32.416893] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:12:55.334 [2024-07-15 10:20:32.416963] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485137 ] 00:12:55.593 [2024-07-15 10:20:32.546201] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:55.593 [2024-07-15 10:20:32.650585] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:55.593 [2024-07-15 10:20:32.715116] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.593 [2024-07-15 10:20:32.715139] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.852 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:55.852 10:20:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:55.852 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:55.852 10:20:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:56.111 BaseBdev1_malloc 00:12:56.111 10:20:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:56.370 true 00:12:56.370 10:20:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:56.628 [2024-07-15 10:20:33.572703] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:56.628 [2024-07-15 10:20:33.572748] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:56.628 [2024-07-15 10:20:33.572770] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd650d0 00:12:56.628 [2024-07-15 10:20:33.572783] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:56.628 [2024-07-15 10:20:33.574684] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:56.628 [2024-07-15 10:20:33.574712] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:56.628 BaseBdev1 00:12:56.628 10:20:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:56.628 10:20:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:56.629 BaseBdev2_malloc 00:12:56.887 10:20:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:56.887 true 00:12:57.145 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:57.145 [2024-07-15 10:20:34.240421] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:57.145 [2024-07-15 10:20:34.240463] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:57.145 [2024-07-15 10:20:34.240484] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd69910 00:12:57.145 [2024-07-15 10:20:34.240497] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:57.145 [2024-07-15 10:20:34.241977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:57.145 [2024-07-15 10:20:34.242004] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:57.145 BaseBdev2 00:12:57.146 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:57.403 [2024-07-15 10:20:34.477083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:57.403 [2024-07-15 10:20:34.478469] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:57.403 [2024-07-15 10:20:34.478658] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd6b320 00:12:57.403 [2024-07-15 10:20:34.478671] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:57.403 [2024-07-15 10:20:34.478871] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd6c290 00:12:57.403 [2024-07-15 10:20:34.479026] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd6b320 00:12:57.403 [2024-07-15 10:20:34.479036] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd6b320 00:12:57.403 [2024-07-15 10:20:34.479140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.403 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:57.661 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.661 "name": "raid_bdev1", 00:12:57.661 "uuid": "ee876562-6042-47ce-9ce5-9bf497a30e52", 00:12:57.661 "strip_size_kb": 64, 00:12:57.661 "state": "online", 00:12:57.661 "raid_level": "concat", 00:12:57.661 "superblock": true, 00:12:57.661 "num_base_bdevs": 2, 00:12:57.661 "num_base_bdevs_discovered": 2, 00:12:57.661 "num_base_bdevs_operational": 2, 00:12:57.661 "base_bdevs_list": [ 00:12:57.661 { 00:12:57.661 "name": "BaseBdev1", 00:12:57.661 "uuid": "1d26bd1a-e979-56be-8540-2c16549ab7aa", 00:12:57.661 "is_configured": true, 00:12:57.661 "data_offset": 2048, 00:12:57.661 "data_size": 63488 00:12:57.661 }, 00:12:57.661 { 00:12:57.661 "name": "BaseBdev2", 00:12:57.661 "uuid": "be8b117b-d6a0-561b-bbca-f4082a92974a", 00:12:57.661 "is_configured": true, 00:12:57.661 "data_offset": 2048, 00:12:57.661 "data_size": 63488 00:12:57.661 } 00:12:57.661 ] 00:12:57.661 }' 00:12:57.661 10:20:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.661 10:20:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.595 10:20:35 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:58.595 10:20:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:58.595 [2024-07-15 10:20:35.636436] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd669b0 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.528 10:20:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:12:59.528 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:12:59.787 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:12:59.787 "name": "raid_bdev1",
00:12:59.787 "uuid": "ee876562-6042-47ce-9ce5-9bf497a30e52",
00:12:59.787 "strip_size_kb": 64,
00:12:59.787 "state": "online",
00:12:59.787 "raid_level": "concat",
00:12:59.787 "superblock": true,
00:12:59.787 "num_base_bdevs": 2,
00:12:59.787 "num_base_bdevs_discovered": 2,
00:12:59.787 "num_base_bdevs_operational": 2,
00:12:59.787 "base_bdevs_list": [
00:12:59.787 {
00:12:59.787 "name": "BaseBdev1",
00:12:59.787 "uuid": "1d26bd1a-e979-56be-8540-2c16549ab7aa",
00:12:59.787 "is_configured": true,
00:12:59.787 "data_offset": 2048,
00:12:59.787 "data_size": 63488
00:12:59.787 },
00:12:59.787 {
00:12:59.787 "name": "BaseBdev2",
00:12:59.787 "uuid": "be8b117b-d6a0-561b-bbca-f4082a92974a",
00:12:59.787 "is_configured": true,
00:12:59.787 "data_offset": 2048,
00:12:59.787 "data_size": 63488
00:12:59.787 }
00:12:59.787 ]
00:12:59.787 }'
00:12:59.787 10:20:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:12:59.787 10:20:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:00.723 10:20:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:13:00.723 [2024-07-15 10:20:37.775980] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:13:00.723 [2024-07-15 10:20:37.776012] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:13:00.723 [2024-07-15 10:20:37.779172] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:13:00.723 [2024-07-15 10:20:37.779202] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:13:00.723 [2024-07-15 10:20:37.779231] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:13:00.723 [2024-07-15 10:20:37.779242] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd6b320 name raid_bdev1, state offline
00:13:00.723 0
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 485137
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 485137 ']'
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 485137
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 485137
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 485137'
00:13:00.724 killing process with pid 485137
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 485137
00:13:00.724 [2024-07-15 10:20:37.844425] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:13:00.724 10:20:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 485137
00:13:00.724 [2024-07-15 10:20:37.855232] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.INKWhAf0Ce
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]]
00:13:00.983
00:13:00.983 real 0m5.747s
00:13:00.983 user 0m9.284s
00:13:00.983 sys 0m1.074s
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:13:00.983 10:20:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:00.983 ************************************
00:13:00.983 END TEST raid_read_error_test
00:13:00.983 ************************************
00:13:00.983 10:20:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:13:00.983 10:20:38 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write
00:13:00.983 10:20:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:13:00.983 10:20:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:13:00.983 10:20:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:13:00.983 ************************************
00:13:00.983 START TEST raid_write_error_test
00:13:00.983 ************************************
00:13:00.983 10:20:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']'
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64'
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Q4CV3wdyBj
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=485985
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 485985 /var/tmp/spdk-raid.sock
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 485985 ']'
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:13:01.242 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:13:01.242 10:20:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:01.242 [2024-07-15 10:20:38.258132] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:13:01.242 [2024-07-15 10:20:38.258202] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485985 ]
00:13:01.242 [2024-07-15 10:20:38.389701] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:13:01.501 [2024-07-15 10:20:38.491959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:13:01.501 [2024-07-15 10:20:38.551916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:01.501 [2024-07-15 10:20:38.551953] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:02.068 10:20:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:13:02.068 10:20:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0
00:13:02.068 10:20:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:13:02.068 10:20:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:13:02.327 BaseBdev1_malloc
00:13:02.327 10:20:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:13:02.588 true
00:13:02.588 10:20:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:13:02.847 [2024-07-15 10:20:39.861289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:13:02.847 [2024-07-15 10:20:39.861334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:02.847 [2024-07-15 10:20:39.861355] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17c50d0
00:13:02.848 [2024-07-15 10:20:39.861368] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:02.848 [2024-07-15 10:20:39.863114] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:02.848 [2024-07-15 10:20:39.863143] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:13:02.848 BaseBdev1
00:13:02.848 10:20:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:13:02.848 10:20:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:13:03.107 BaseBdev2_malloc
00:13:03.107 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:13:03.107 true
00:13:03.366 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:13:03.366 [2024-07-15 10:20:40.475572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:13:03.366 [2024-07-15 10:20:40.475626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:03.366 [2024-07-15 10:20:40.475647] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17c9910
00:13:03.366 [2024-07-15 10:20:40.475660] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:03.366 [2024-07-15 10:20:40.477167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:03.366 [2024-07-15 10:20:40.477195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:13:03.366 BaseBdev2
00:13:03.366 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
00:13:03.626 [2024-07-15 10:20:40.720244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:13:03.626 [2024-07-15 10:20:40.721446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:13:03.626 [2024-07-15 10:20:40.721629] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17cb320
00:13:03.626 [2024-07-15 10:20:40.721642] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512
00:13:03.626 [2024-07-15 10:20:40.721830] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17cc290
00:13:03.626 [2024-07-15 10:20:40.721981] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17cb320
00:13:03.626 [2024-07-15 10:20:40.721992] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17cb320
00:13:03.626 [2024-07-15 10:20:40.722093] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:03.626 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:13:03.884 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:03.884 "name": "raid_bdev1",
00:13:03.884 "uuid": "3196dd9e-95bc-4c40-9b7c-1b8e46284190",
00:13:03.884 "strip_size_kb": 64,
00:13:03.884 "state": "online",
00:13:03.884 "raid_level": "concat",
00:13:03.884 "superblock": true,
00:13:03.884 "num_base_bdevs": 2,
00:13:03.884 "num_base_bdevs_discovered": 2,
00:13:03.884 "num_base_bdevs_operational": 2,
00:13:03.884 "base_bdevs_list": [
00:13:03.884 {
00:13:03.884 "name": "BaseBdev1",
00:13:03.884 "uuid": "221cd7b9-6ff3-5977-8f80-7b29c40b90fa",
00:13:03.884 "is_configured": true,
00:13:03.884 "data_offset": 2048,
00:13:03.884 "data_size": 63488
00:13:03.884 },
00:13:03.884 {
00:13:03.884 "name": "BaseBdev2",
00:13:03.884 "uuid": "55b00ace-2d73-5692-8468-85ff87dfe1b7",
00:13:03.884 "is_configured": true,
00:13:03.884 "data_offset": 2048,
00:13:03.884 "data_size": 63488
00:13:03.884 }
00:13:03.884 ]
00:13:03.884 }'
00:13:03.885 10:20:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:03.885 10:20:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:04.452 10:20:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:13:04.452 10:20:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:13:04.712 [2024-07-15 10:20:41.699133] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17c69b0
00:13:05.648 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:13:05.648 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:13:05.648 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]]
00:13:05.648 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2
00:13:05.648 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:05.907 10:20:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:13:05.907 10:20:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:05.907 "name": "raid_bdev1",
00:13:05.907 "uuid": "3196dd9e-95bc-4c40-9b7c-1b8e46284190",
00:13:05.907 "strip_size_kb": 64,
00:13:05.907 "state": "online",
00:13:05.907 "raid_level": "concat",
00:13:05.907 "superblock": true,
00:13:05.907 "num_base_bdevs": 2,
00:13:05.907 "num_base_bdevs_discovered": 2,
00:13:05.907 "num_base_bdevs_operational": 2,
00:13:05.907 "base_bdevs_list": [
00:13:05.907 {
00:13:05.907 "name": "BaseBdev1",
00:13:05.907 "uuid": "221cd7b9-6ff3-5977-8f80-7b29c40b90fa",
00:13:05.907 "is_configured": true,
00:13:05.907 "data_offset": 2048,
00:13:05.907 "data_size": 63488
00:13:05.907 },
00:13:05.907 {
00:13:05.907 "name": "BaseBdev2",
00:13:05.907 "uuid": "55b00ace-2d73-5692-8468-85ff87dfe1b7",
00:13:05.907 "is_configured": true,
00:13:05.907 "data_offset": 2048,
00:13:05.907 "data_size": 63488
00:13:05.907 }
00:13:05.907 ]
00:13:05.907 }'
00:13:05.907 10:20:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:05.907 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:13:06.844 [2024-07-15 10:20:43.928526] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:13:06.844 [2024-07-15 10:20:43.928568] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:13:06.844 [2024-07-15 10:20:43.931741] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:13:06.844 [2024-07-15 10:20:43.931772] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:13:06.844 [2024-07-15 10:20:43.931800] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:13:06.844 [2024-07-15 10:20:43.931812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17cb320 name raid_bdev1, state offline
00:13:06.844 0
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 485985
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 485985 ']'
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 485985
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 485985
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 485985'
00:13:06.844 killing process with pid 485985
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 485985
00:13:06.844 [2024-07-15 10:20:43.997114] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:13:06.844 10:20:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 485985
00:13:06.844 [2024-07-15 10:20:44.008343] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Q4CV3wdyBj
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]]
00:13:07.103
00:13:07.103 real 0m6.073s
00:13:07.103 user 0m9.445s
00:13:07.103 sys 0m1.059s
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:13:07.103 10:20:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:07.103 ************************************
00:13:07.103 END TEST raid_write_error_test
00:13:07.103 ************************************
00:13:07.103 10:20:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:13:07.103 10:20:44 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1
00:13:07.103 10:20:44 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false
00:13:07.103 10:20:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:13:07.103 10:20:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:13:07.363 10:20:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:13:07.363 ************************************
00:13:07.363 START TEST raid_state_function_test
00:13:07.363 ************************************
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']'
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=486805
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 486805'
00:13:07.363 Process raid pid: 486805
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 486805 /var/tmp/spdk-raid.sock
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 486805 ']'
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:13:07.363 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:13:07.363 10:20:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:13:07.363 [2024-07-15 10:20:44.413736] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:13:07.363 [2024-07-15 10:20:44.413811] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:13:07.363 [2024-07-15 10:20:44.546322] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:13:07.622 [2024-07-15 10:20:44.653484] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:13:07.622 [2024-07-15 10:20:44.716343] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:07.622 [2024-07-15 10:20:44.716370] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:08.190 10:20:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:13:08.190 10:20:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:13:08.190 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:13:08.449 [2024-07-15 10:20:45.506756] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:13:08.449 [2024-07-15 10:20:45.506801] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:13:08.449 [2024-07-15 10:20:45.506811] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:13:08.449 [2024-07-15 10:20:45.506823] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:08.450 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:08.710 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:08.710 "name": "Existed_Raid",
00:13:08.710 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:08.710 "strip_size_kb": 0,
00:13:08.710 "state": "configuring",
00:13:08.710 "raid_level": "raid1",
00:13:08.710 "superblock": false,
00:13:08.710 "num_base_bdevs": 2,
00:13:08.710 "num_base_bdevs_discovered": 0,
00:13:08.710 "num_base_bdevs_operational": 2,
00:13:08.710 "base_bdevs_list": [
00:13:08.710 {
00:13:08.710 "name": "BaseBdev1",
00:13:08.710 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:08.710 "is_configured": false,
00:13:08.710 "data_offset": 0,
00:13:08.710 "data_size": 0
00:13:08.710 },
00:13:08.710 {
00:13:08.710 "name": "BaseBdev2",
00:13:08.710 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:08.710 "is_configured": false,
00:13:08.710 "data_offset": 0,
00:13:08.710 "data_size": 0
00:13:08.710 }
00:13:08.710 ]
00:13:08.710 }'
00:13:08.710 10:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:08.710 10:20:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:13:09.316 10:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:13:09.575 [2024-07-15 10:20:46.589489] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:13:09.575 [2024-07-15 10:20:46.589523] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb0fa80 name Existed_Raid, state configuring
00:13:09.575 10:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:13:09.834 [2024-07-15 10:20:46.834147] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:13:09.834 [2024-07-15 10:20:46.834178] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:13:09.834 [2024-07-15 10:20:46.834188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:13:09.834 [2024-07-15 10:20:46.834200] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:13:09.834 10:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:13:10.093 [2024-07-15 10:20:47.088784] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:13:10.093 BaseBdev1
00:13:10.093 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:13:10.093 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:13:10.093 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:13:10.093 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:13:10.093 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:13:10.093 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:13:10.093 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:13:10.352 [
00:13:10.352 {
00:13:10.352 "name": "BaseBdev1",
00:13:10.352 "aliases": [
00:13:10.352 "2722f6d7-836b-48e0-a485-9ad4db540815"
00:13:10.352 ],
00:13:10.352 "product_name": "Malloc disk",
00:13:10.352 "block_size": 512,
00:13:10.352 "num_blocks": 65536,
00:13:10.352 "uuid": "2722f6d7-836b-48e0-a485-9ad4db540815",
00:13:10.352 "assigned_rate_limits": {
00:13:10.352 "rw_ios_per_sec": 0,
00:13:10.352 "rw_mbytes_per_sec": 0,
00:13:10.352 "r_mbytes_per_sec": 0,
00:13:10.352 "w_mbytes_per_sec": 0
00:13:10.352 },
00:13:10.352 "claimed": true,
00:13:10.352 "claim_type": "exclusive_write",
00:13:10.352 "zoned": false,
00:13:10.352 "supported_io_types": {
00:13:10.352 "read": true,
00:13:10.352 "write": true,
00:13:10.352 "unmap": true,
00:13:10.352 "flush": true,
00:13:10.352 "reset": true,
00:13:10.352 "nvme_admin": false,
00:13:10.352 "nvme_io": false,
00:13:10.352 "nvme_io_md": false,
00:13:10.352 "write_zeroes": true,
00:13:10.352 "zcopy": true,
00:13:10.352 "get_zone_info": false,
00:13:10.352 "zone_management": false,
00:13:10.352 "zone_append": false,
00:13:10.352 "compare": false,
00:13:10.352 "compare_and_write": false,
00:13:10.352 "abort": true,
00:13:10.352 "seek_hole": false,
00:13:10.352 "seek_data": false,
00:13:10.352 "copy": true,
00:13:10.352 "nvme_iov_md": false
00:13:10.352 },
00:13:10.352 "memory_domains": [
00:13:10.352 {
00:13:10.352 "dma_device_id": "system",
00:13:10.352 "dma_device_type": 1
00:13:10.352 },
00:13:10.352 {
00:13:10.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:10.352 "dma_device_type": 2
00:13:10.352 }
00:13:10.352 ],
00:13:10.352 "driver_specific": {}
00:13:10.352 }
00:13:10.352 ]
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:13:10.352 10:20:47 bdev_raid.raid_state_function_test --
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.352 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:10.611 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.611 "name": "Existed_Raid", 00:13:10.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.611 "strip_size_kb": 0, 00:13:10.611 "state": "configuring", 00:13:10.611 "raid_level": "raid1", 00:13:10.611 "superblock": false, 00:13:10.611 "num_base_bdevs": 2, 00:13:10.611 "num_base_bdevs_discovered": 1, 00:13:10.611 "num_base_bdevs_operational": 2, 00:13:10.611 "base_bdevs_list": [ 00:13:10.611 { 00:13:10.611 "name": "BaseBdev1", 00:13:10.611 "uuid": "2722f6d7-836b-48e0-a485-9ad4db540815", 00:13:10.611 "is_configured": true, 00:13:10.611 "data_offset": 0, 00:13:10.611 "data_size": 65536 00:13:10.611 }, 00:13:10.611 { 00:13:10.611 "name": "BaseBdev2", 00:13:10.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:10.611 "is_configured": false, 00:13:10.611 "data_offset": 0, 00:13:10.611 "data_size": 0 00:13:10.611 } 00:13:10.611 ] 00:13:10.611 }' 00:13:10.611 10:20:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.611 10:20:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.548 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:11.548 [2024-07-15 10:20:48.612830] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:11.548 [2024-07-15 10:20:48.612877] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb0f350 name Existed_Raid, state configuring 00:13:11.548 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:11.807 [2024-07-15 10:20:48.861512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.807 [2024-07-15 10:20:48.863063] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.807 [2024-07-15 10:20:48.863095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.807 10:20:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.807 10:20:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.066 10:20:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.066 "name": "Existed_Raid", 00:13:12.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.066 "strip_size_kb": 0, 00:13:12.066 "state": "configuring", 00:13:12.066 "raid_level": "raid1", 00:13:12.066 "superblock": false, 00:13:12.066 "num_base_bdevs": 2, 00:13:12.066 "num_base_bdevs_discovered": 1, 00:13:12.066 "num_base_bdevs_operational": 2, 00:13:12.066 "base_bdevs_list": [ 00:13:12.066 { 00:13:12.066 "name": "BaseBdev1", 00:13:12.066 "uuid": "2722f6d7-836b-48e0-a485-9ad4db540815", 00:13:12.066 "is_configured": true, 00:13:12.066 "data_offset": 0, 00:13:12.066 "data_size": 65536 00:13:12.066 }, 00:13:12.066 { 00:13:12.066 "name": "BaseBdev2", 00:13:12.066 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.066 "is_configured": false, 00:13:12.066 "data_offset": 0, 00:13:12.066 "data_size": 0 00:13:12.066 } 00:13:12.066 ] 00:13:12.066 }' 00:13:12.066 10:20:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.066 10:20:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:12.635 10:20:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:12.894 [2024-07-15 10:20:49.931805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:12.894 [2024-07-15 10:20:49.931847] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb10000 00:13:12.894 [2024-07-15 10:20:49.931856] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:12.894 [2024-07-15 10:20:49.932059] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa2a0c0 00:13:12.894 [2024-07-15 10:20:49.932180] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb10000 00:13:12.894 [2024-07-15 10:20:49.932195] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xb10000 00:13:12.894 [2024-07-15 10:20:49.932363] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:12.894 BaseBdev2 00:13:12.894 10:20:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:12.894 10:20:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:12.894 10:20:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:12.894 10:20:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:12.894 10:20:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:12.894 10:20:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:12.894 10:20:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.153 10:20:50 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:13.412 [ 00:13:13.412 { 00:13:13.412 "name": "BaseBdev2", 00:13:13.412 "aliases": [ 00:13:13.412 "5c6e77fc-726e-462f-8495-eae75c7c9436" 00:13:13.412 ], 00:13:13.412 "product_name": "Malloc disk", 00:13:13.412 "block_size": 512, 00:13:13.412 "num_blocks": 65536, 00:13:13.412 "uuid": "5c6e77fc-726e-462f-8495-eae75c7c9436", 00:13:13.412 "assigned_rate_limits": { 00:13:13.412 "rw_ios_per_sec": 0, 00:13:13.412 "rw_mbytes_per_sec": 0, 00:13:13.412 "r_mbytes_per_sec": 0, 00:13:13.412 "w_mbytes_per_sec": 0 00:13:13.412 }, 00:13:13.412 "claimed": true, 00:13:13.412 "claim_type": "exclusive_write", 00:13:13.412 "zoned": false, 00:13:13.412 "supported_io_types": { 00:13:13.412 "read": true, 00:13:13.412 "write": true, 00:13:13.412 "unmap": true, 00:13:13.412 "flush": true, 00:13:13.412 "reset": true, 00:13:13.412 "nvme_admin": false, 00:13:13.412 "nvme_io": false, 00:13:13.412 "nvme_io_md": false, 00:13:13.412 "write_zeroes": true, 00:13:13.412 "zcopy": true, 00:13:13.412 "get_zone_info": false, 00:13:13.412 "zone_management": false, 00:13:13.412 "zone_append": false, 00:13:13.412 "compare": false, 00:13:13.412 "compare_and_write": false, 00:13:13.412 "abort": true, 00:13:13.412 "seek_hole": false, 00:13:13.412 "seek_data": false, 00:13:13.412 "copy": true, 00:13:13.412 "nvme_iov_md": false 00:13:13.412 }, 00:13:13.412 "memory_domains": [ 00:13:13.412 { 00:13:13.412 "dma_device_id": "system", 00:13:13.412 "dma_device_type": 1 00:13:13.412 }, 00:13:13.412 { 00:13:13.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.412 "dma_device_type": 2 00:13:13.412 } 00:13:13.412 ], 00:13:13.412 "driver_specific": {} 00:13:13.412 } 00:13:13.412 ] 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.412 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.671 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.671 "name": "Existed_Raid", 00:13:13.671 "uuid": "5506a06b-2f50-409f-8d1d-0fb50dad9abf", 00:13:13.671 "strip_size_kb": 0, 00:13:13.671 "state": "online", 00:13:13.671 "raid_level": "raid1", 00:13:13.671 "superblock": false, 00:13:13.671 "num_base_bdevs": 
2, 00:13:13.671 "num_base_bdevs_discovered": 2, 00:13:13.671 "num_base_bdevs_operational": 2, 00:13:13.671 "base_bdevs_list": [ 00:13:13.671 { 00:13:13.671 "name": "BaseBdev1", 00:13:13.671 "uuid": "2722f6d7-836b-48e0-a485-9ad4db540815", 00:13:13.671 "is_configured": true, 00:13:13.671 "data_offset": 0, 00:13:13.671 "data_size": 65536 00:13:13.671 }, 00:13:13.671 { 00:13:13.671 "name": "BaseBdev2", 00:13:13.671 "uuid": "5c6e77fc-726e-462f-8495-eae75c7c9436", 00:13:13.671 "is_configured": true, 00:13:13.671 "data_offset": 0, 00:13:13.671 "data_size": 65536 00:13:13.671 } 00:13:13.671 ] 00:13:13.671 }' 00:13:13.671 10:20:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.671 10:20:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:14.239 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:14.498 [2024-07-15 10:20:51.524348] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:14.498 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:13:14.498 "name": "Existed_Raid", 00:13:14.498 "aliases": [ 00:13:14.498 "5506a06b-2f50-409f-8d1d-0fb50dad9abf" 00:13:14.498 ], 00:13:14.498 "product_name": "Raid Volume", 00:13:14.498 "block_size": 512, 00:13:14.498 "num_blocks": 65536, 00:13:14.498 "uuid": "5506a06b-2f50-409f-8d1d-0fb50dad9abf", 00:13:14.498 "assigned_rate_limits": { 00:13:14.498 "rw_ios_per_sec": 0, 00:13:14.498 "rw_mbytes_per_sec": 0, 00:13:14.498 "r_mbytes_per_sec": 0, 00:13:14.498 "w_mbytes_per_sec": 0 00:13:14.498 }, 00:13:14.498 "claimed": false, 00:13:14.498 "zoned": false, 00:13:14.498 "supported_io_types": { 00:13:14.498 "read": true, 00:13:14.498 "write": true, 00:13:14.498 "unmap": false, 00:13:14.498 "flush": false, 00:13:14.498 "reset": true, 00:13:14.498 "nvme_admin": false, 00:13:14.498 "nvme_io": false, 00:13:14.498 "nvme_io_md": false, 00:13:14.498 "write_zeroes": true, 00:13:14.498 "zcopy": false, 00:13:14.498 "get_zone_info": false, 00:13:14.498 "zone_management": false, 00:13:14.498 "zone_append": false, 00:13:14.498 "compare": false, 00:13:14.498 "compare_and_write": false, 00:13:14.498 "abort": false, 00:13:14.498 "seek_hole": false, 00:13:14.498 "seek_data": false, 00:13:14.498 "copy": false, 00:13:14.498 "nvme_iov_md": false 00:13:14.498 }, 00:13:14.498 "memory_domains": [ 00:13:14.498 { 00:13:14.498 "dma_device_id": "system", 00:13:14.498 "dma_device_type": 1 00:13:14.498 }, 00:13:14.498 { 00:13:14.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.498 "dma_device_type": 2 00:13:14.498 }, 00:13:14.498 { 00:13:14.498 "dma_device_id": "system", 00:13:14.498 "dma_device_type": 1 00:13:14.498 }, 00:13:14.498 { 00:13:14.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.498 "dma_device_type": 2 00:13:14.498 } 00:13:14.498 ], 00:13:14.498 "driver_specific": { 00:13:14.498 "raid": { 00:13:14.498 "uuid": "5506a06b-2f50-409f-8d1d-0fb50dad9abf", 00:13:14.498 "strip_size_kb": 0, 00:13:14.498 "state": "online", 00:13:14.498 "raid_level": "raid1", 
00:13:14.498 "superblock": false, 00:13:14.499 "num_base_bdevs": 2, 00:13:14.499 "num_base_bdevs_discovered": 2, 00:13:14.499 "num_base_bdevs_operational": 2, 00:13:14.499 "base_bdevs_list": [ 00:13:14.499 { 00:13:14.499 "name": "BaseBdev1", 00:13:14.499 "uuid": "2722f6d7-836b-48e0-a485-9ad4db540815", 00:13:14.499 "is_configured": true, 00:13:14.499 "data_offset": 0, 00:13:14.499 "data_size": 65536 00:13:14.499 }, 00:13:14.499 { 00:13:14.499 "name": "BaseBdev2", 00:13:14.499 "uuid": "5c6e77fc-726e-462f-8495-eae75c7c9436", 00:13:14.499 "is_configured": true, 00:13:14.499 "data_offset": 0, 00:13:14.499 "data_size": 65536 00:13:14.499 } 00:13:14.499 ] 00:13:14.499 } 00:13:14.499 } 00:13:14.499 }' 00:13:14.499 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:14.499 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:14.499 BaseBdev2' 00:13:14.499 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:14.499 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:14.499 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:14.758 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:14.758 "name": "BaseBdev1", 00:13:14.758 "aliases": [ 00:13:14.758 "2722f6d7-836b-48e0-a485-9ad4db540815" 00:13:14.758 ], 00:13:14.758 "product_name": "Malloc disk", 00:13:14.758 "block_size": 512, 00:13:14.758 "num_blocks": 65536, 00:13:14.758 "uuid": "2722f6d7-836b-48e0-a485-9ad4db540815", 00:13:14.758 "assigned_rate_limits": { 00:13:14.758 "rw_ios_per_sec": 0, 00:13:14.758 "rw_mbytes_per_sec": 0, 00:13:14.758 "r_mbytes_per_sec": 0, 00:13:14.758 
"w_mbytes_per_sec": 0 00:13:14.758 }, 00:13:14.758 "claimed": true, 00:13:14.758 "claim_type": "exclusive_write", 00:13:14.758 "zoned": false, 00:13:14.758 "supported_io_types": { 00:13:14.758 "read": true, 00:13:14.758 "write": true, 00:13:14.758 "unmap": true, 00:13:14.758 "flush": true, 00:13:14.758 "reset": true, 00:13:14.758 "nvme_admin": false, 00:13:14.758 "nvme_io": false, 00:13:14.758 "nvme_io_md": false, 00:13:14.758 "write_zeroes": true, 00:13:14.758 "zcopy": true, 00:13:14.758 "get_zone_info": false, 00:13:14.758 "zone_management": false, 00:13:14.758 "zone_append": false, 00:13:14.758 "compare": false, 00:13:14.758 "compare_and_write": false, 00:13:14.758 "abort": true, 00:13:14.758 "seek_hole": false, 00:13:14.758 "seek_data": false, 00:13:14.758 "copy": true, 00:13:14.758 "nvme_iov_md": false 00:13:14.758 }, 00:13:14.758 "memory_domains": [ 00:13:14.758 { 00:13:14.758 "dma_device_id": "system", 00:13:14.758 "dma_device_type": 1 00:13:14.758 }, 00:13:14.758 { 00:13:14.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.758 "dma_device_type": 2 00:13:14.758 } 00:13:14.758 ], 00:13:14.758 "driver_specific": {} 00:13:14.758 }' 00:13:14.758 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.758 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:14.758 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:14.758 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.017 10:20:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.017 
10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:15.017 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:15.275 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:15.275 "name": "BaseBdev2", 00:13:15.275 "aliases": [ 00:13:15.275 "5c6e77fc-726e-462f-8495-eae75c7c9436" 00:13:15.275 ], 00:13:15.275 "product_name": "Malloc disk", 00:13:15.275 "block_size": 512, 00:13:15.275 "num_blocks": 65536, 00:13:15.275 "uuid": "5c6e77fc-726e-462f-8495-eae75c7c9436", 00:13:15.275 "assigned_rate_limits": { 00:13:15.275 "rw_ios_per_sec": 0, 00:13:15.275 "rw_mbytes_per_sec": 0, 00:13:15.275 "r_mbytes_per_sec": 0, 00:13:15.275 "w_mbytes_per_sec": 0 00:13:15.275 }, 00:13:15.275 "claimed": true, 00:13:15.275 "claim_type": "exclusive_write", 00:13:15.275 "zoned": false, 00:13:15.275 "supported_io_types": { 00:13:15.275 "read": true, 00:13:15.275 "write": true, 00:13:15.275 "unmap": true, 00:13:15.275 "flush": true, 00:13:15.275 "reset": true, 00:13:15.275 "nvme_admin": false, 00:13:15.275 "nvme_io": false, 00:13:15.275 "nvme_io_md": false, 00:13:15.275 "write_zeroes": true, 00:13:15.275 "zcopy": true, 00:13:15.275 "get_zone_info": false, 00:13:15.275 "zone_management": false, 00:13:15.275 "zone_append": false, 00:13:15.275 "compare": 
false, 00:13:15.275 "compare_and_write": false, 00:13:15.275 "abort": true, 00:13:15.275 "seek_hole": false, 00:13:15.275 "seek_data": false, 00:13:15.275 "copy": true, 00:13:15.275 "nvme_iov_md": false 00:13:15.275 }, 00:13:15.275 "memory_domains": [ 00:13:15.275 { 00:13:15.275 "dma_device_id": "system", 00:13:15.275 "dma_device_type": 1 00:13:15.275 }, 00:13:15.275 { 00:13:15.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.275 "dma_device_type": 2 00:13:15.275 } 00:13:15.275 ], 00:13:15.275 "driver_specific": {} 00:13:15.275 }' 00:13:15.275 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.275 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.534 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:15.792 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:15.792 10:20:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:16.050 
[2024-07-15 10:20:52.992051] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.050 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.050 10:20:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.308 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.308 "name": "Existed_Raid", 00:13:16.308 "uuid": "5506a06b-2f50-409f-8d1d-0fb50dad9abf", 00:13:16.308 "strip_size_kb": 0, 00:13:16.308 "state": "online", 00:13:16.308 "raid_level": "raid1", 00:13:16.308 "superblock": false, 00:13:16.308 "num_base_bdevs": 2, 00:13:16.308 "num_base_bdevs_discovered": 1, 00:13:16.308 "num_base_bdevs_operational": 1, 00:13:16.308 "base_bdevs_list": [ 00:13:16.308 { 00:13:16.308 "name": null, 00:13:16.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.308 "is_configured": false, 00:13:16.308 "data_offset": 0, 00:13:16.308 "data_size": 65536 00:13:16.308 }, 00:13:16.308 { 00:13:16.308 "name": "BaseBdev2", 00:13:16.308 "uuid": "5c6e77fc-726e-462f-8495-eae75c7c9436", 00:13:16.308 "is_configured": true, 00:13:16.308 "data_offset": 0, 00:13:16.308 "data_size": 65536 00:13:16.308 } 00:13:16.308 ] 00:13:16.308 }' 00:13:16.308 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.308 10:20:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:16.873 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:16.873 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:16.873 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:16.873 10:20:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.130 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:17.130 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:13:17.130 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:17.389 [2024-07-15 10:20:54.329674] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:17.389 [2024-07-15 10:20:54.329766] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:17.389 [2024-07-15 10:20:54.342220] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:17.389 [2024-07-15 10:20:54.342260] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:17.389 [2024-07-15 10:20:54.342274] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb10000 name Existed_Raid, state offline 00:13:17.389 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:17.389 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:17.389 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.389 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 486805 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 486805 ']' 00:13:17.648 10:20:54 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 486805 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 486805 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:17.648 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 486805' 00:13:17.648 killing process with pid 486805 00:13:17.649 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 486805 00:13:17.649 [2024-07-15 10:20:54.649406] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:17.649 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 486805 00:13:17.649 [2024-07-15 10:20:54.650415] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:17.908 00:13:17.908 real 0m10.519s 00:13:17.908 user 0m18.655s 00:13:17.908 sys 0m1.991s 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.908 ************************************ 00:13:17.908 END TEST raid_state_function_test 00:13:17.908 ************************************ 00:13:17.908 10:20:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:17.908 10:20:54 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:13:17.908 10:20:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:17.908 10:20:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:17.908 10:20:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:17.908 ************************************ 00:13:17.908 START TEST raid_state_function_test_sb 00:13:17.908 ************************************ 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=488425 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 488425' 00:13:17.908 Process raid pid: 488425 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 488425 /var/tmp/spdk-raid.sock 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 488425 ']' 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:17.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:17.908 10:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.908 [2024-07-15 10:20:55.041705] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:13:17.908 [2024-07-15 10:20:55.041837] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:18.166 [2024-07-15 10:20:55.237401] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:18.167 [2024-07-15 10:20:55.338179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:18.424 [2024-07-15 10:20:55.405917] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.424 [2024-07-15 10:20:55.405972] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:18.990 10:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:18.990 10:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:18.990 10:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:18.990 [2024-07-15 10:20:56.156475] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:13:18.990 [2024-07-15 10:20:56.156519] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:18.990 [2024-07-15 10:20:56.156530] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:18.990 [2024-07-15 10:20:56.156541] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.990 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:19.247 10:20:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.247 "name": "Existed_Raid", 00:13:19.247 "uuid": "c375ef5c-dee9-4e91-87f9-22b499f8aab3", 00:13:19.247 "strip_size_kb": 0, 00:13:19.247 "state": "configuring", 00:13:19.247 "raid_level": "raid1", 00:13:19.247 "superblock": true, 00:13:19.247 "num_base_bdevs": 2, 00:13:19.247 "num_base_bdevs_discovered": 0, 00:13:19.247 "num_base_bdevs_operational": 2, 00:13:19.247 "base_bdevs_list": [ 00:13:19.247 { 00:13:19.247 "name": "BaseBdev1", 00:13:19.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.247 "is_configured": false, 00:13:19.247 "data_offset": 0, 00:13:19.247 "data_size": 0 00:13:19.247 }, 00:13:19.247 { 00:13:19.247 "name": "BaseBdev2", 00:13:19.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.247 "is_configured": false, 00:13:19.247 "data_offset": 0, 00:13:19.247 "data_size": 0 00:13:19.247 } 00:13:19.247 ] 00:13:19.247 }' 00:13:19.247 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.247 10:20:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:19.810 10:20:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:20.067 [2024-07-15 10:20:57.223150] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:20.067 [2024-07-15 10:20:57.223185] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1891a80 name Existed_Raid, state configuring 00:13:20.067 10:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:20.324 [2024-07-15 10:20:57.467818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:20.324 
[2024-07-15 10:20:57.467854] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:20.324 [2024-07-15 10:20:57.467864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:20.324 [2024-07-15 10:20:57.467875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:20.324 10:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:20.582 [2024-07-15 10:20:57.726404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:20.582 BaseBdev1 00:13:20.582 10:20:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:20.582 10:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:20.582 10:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:20.582 10:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:20.582 10:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:20.582 10:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:20.582 10:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:20.838 10:20:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:21.095 [ 00:13:21.095 { 00:13:21.095 "name": "BaseBdev1", 00:13:21.095 "aliases": [ 00:13:21.095 
"06faaa39-9e2e-4840-a6e4-73924ddf15a2" 00:13:21.095 ], 00:13:21.095 "product_name": "Malloc disk", 00:13:21.095 "block_size": 512, 00:13:21.095 "num_blocks": 65536, 00:13:21.095 "uuid": "06faaa39-9e2e-4840-a6e4-73924ddf15a2", 00:13:21.095 "assigned_rate_limits": { 00:13:21.095 "rw_ios_per_sec": 0, 00:13:21.095 "rw_mbytes_per_sec": 0, 00:13:21.095 "r_mbytes_per_sec": 0, 00:13:21.095 "w_mbytes_per_sec": 0 00:13:21.095 }, 00:13:21.095 "claimed": true, 00:13:21.095 "claim_type": "exclusive_write", 00:13:21.095 "zoned": false, 00:13:21.095 "supported_io_types": { 00:13:21.095 "read": true, 00:13:21.095 "write": true, 00:13:21.095 "unmap": true, 00:13:21.095 "flush": true, 00:13:21.095 "reset": true, 00:13:21.095 "nvme_admin": false, 00:13:21.095 "nvme_io": false, 00:13:21.095 "nvme_io_md": false, 00:13:21.095 "write_zeroes": true, 00:13:21.095 "zcopy": true, 00:13:21.095 "get_zone_info": false, 00:13:21.095 "zone_management": false, 00:13:21.095 "zone_append": false, 00:13:21.095 "compare": false, 00:13:21.095 "compare_and_write": false, 00:13:21.095 "abort": true, 00:13:21.095 "seek_hole": false, 00:13:21.095 "seek_data": false, 00:13:21.095 "copy": true, 00:13:21.095 "nvme_iov_md": false 00:13:21.095 }, 00:13:21.095 "memory_domains": [ 00:13:21.095 { 00:13:21.095 "dma_device_id": "system", 00:13:21.095 "dma_device_type": 1 00:13:21.095 }, 00:13:21.095 { 00:13:21.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:21.095 "dma_device_type": 2 00:13:21.095 } 00:13:21.095 ], 00:13:21.095 "driver_specific": {} 00:13:21.095 } 00:13:21.095 ] 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.095 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.352 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.352 "name": "Existed_Raid", 00:13:21.352 "uuid": "9a6c565a-21f1-40cb-8dc2-5d10b21a5775", 00:13:21.352 "strip_size_kb": 0, 00:13:21.352 "state": "configuring", 00:13:21.352 "raid_level": "raid1", 00:13:21.352 "superblock": true, 00:13:21.352 "num_base_bdevs": 2, 00:13:21.352 "num_base_bdevs_discovered": 1, 00:13:21.352 "num_base_bdevs_operational": 2, 00:13:21.352 "base_bdevs_list": [ 00:13:21.352 { 00:13:21.352 "name": "BaseBdev1", 00:13:21.352 "uuid": "06faaa39-9e2e-4840-a6e4-73924ddf15a2", 00:13:21.352 "is_configured": true, 00:13:21.352 "data_offset": 2048, 00:13:21.352 "data_size": 63488 00:13:21.352 }, 00:13:21.352 { 00:13:21.352 "name": "BaseBdev2", 00:13:21.352 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:13:21.352 "is_configured": false, 00:13:21.352 "data_offset": 0, 00:13:21.352 "data_size": 0 00:13:21.352 } 00:13:21.352 ] 00:13:21.352 }' 00:13:21.352 10:20:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.352 10:20:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:21.918 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:22.176 [2024-07-15 10:20:59.274653] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:22.176 [2024-07-15 10:20:59.274689] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1891350 name Existed_Raid, state configuring 00:13:22.176 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:22.434 [2024-07-15 10:20:59.519337] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:22.434 [2024-07-15 10:20:59.520840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:22.434 [2024-07-15 10:20:59.520873] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:22.434 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.692 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:22.692 "name": "Existed_Raid", 00:13:22.692 "uuid": "12473803-7f5b-40c2-aa30-aebe4e44580d", 00:13:22.692 "strip_size_kb": 0, 00:13:22.692 "state": "configuring", 00:13:22.692 "raid_level": "raid1", 00:13:22.692 "superblock": true, 00:13:22.692 "num_base_bdevs": 2, 00:13:22.692 "num_base_bdevs_discovered": 1, 00:13:22.692 "num_base_bdevs_operational": 2, 00:13:22.692 "base_bdevs_list": [ 00:13:22.692 { 00:13:22.692 "name": "BaseBdev1", 00:13:22.692 "uuid": "06faaa39-9e2e-4840-a6e4-73924ddf15a2", 00:13:22.692 "is_configured": true, 00:13:22.692 "data_offset": 2048, 00:13:22.692 "data_size": 63488 00:13:22.692 }, 00:13:22.692 
{ 00:13:22.692 "name": "BaseBdev2", 00:13:22.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:22.692 "is_configured": false, 00:13:22.692 "data_offset": 0, 00:13:22.692 "data_size": 0 00:13:22.692 } 00:13:22.692 ] 00:13:22.692 }' 00:13:22.692 10:20:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:22.692 10:20:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:23.258 10:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:23.552 [2024-07-15 10:21:00.593620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:23.552 [2024-07-15 10:21:00.593775] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1892000 00:13:23.552 [2024-07-15 10:21:00.593789] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:23.552 [2024-07-15 10:21:00.593965] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17ac0c0 00:13:23.552 [2024-07-15 10:21:00.594087] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1892000 00:13:23.552 [2024-07-15 10:21:00.594098] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1892000 00:13:23.552 [2024-07-15 10:21:00.594190] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:23.552 BaseBdev2 00:13:23.552 10:21:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:23.552 10:21:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:23.552 10:21:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:23.552 10:21:00 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:13:23.552 10:21:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:23.552 10:21:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:23.552 10:21:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:23.811 10:21:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:24.071 [ 00:13:24.071 { 00:13:24.071 "name": "BaseBdev2", 00:13:24.071 "aliases": [ 00:13:24.071 "720de466-ebe8-4cb0-9ed6-d820c468005e" 00:13:24.071 ], 00:13:24.071 "product_name": "Malloc disk", 00:13:24.071 "block_size": 512, 00:13:24.071 "num_blocks": 65536, 00:13:24.071 "uuid": "720de466-ebe8-4cb0-9ed6-d820c468005e", 00:13:24.071 "assigned_rate_limits": { 00:13:24.071 "rw_ios_per_sec": 0, 00:13:24.071 "rw_mbytes_per_sec": 0, 00:13:24.071 "r_mbytes_per_sec": 0, 00:13:24.071 "w_mbytes_per_sec": 0 00:13:24.071 }, 00:13:24.071 "claimed": true, 00:13:24.071 "claim_type": "exclusive_write", 00:13:24.071 "zoned": false, 00:13:24.071 "supported_io_types": { 00:13:24.071 "read": true, 00:13:24.071 "write": true, 00:13:24.071 "unmap": true, 00:13:24.071 "flush": true, 00:13:24.071 "reset": true, 00:13:24.071 "nvme_admin": false, 00:13:24.071 "nvme_io": false, 00:13:24.071 "nvme_io_md": false, 00:13:24.071 "write_zeroes": true, 00:13:24.071 "zcopy": true, 00:13:24.071 "get_zone_info": false, 00:13:24.071 "zone_management": false, 00:13:24.071 "zone_append": false, 00:13:24.071 "compare": false, 00:13:24.071 "compare_and_write": false, 00:13:24.071 "abort": true, 00:13:24.071 "seek_hole": false, 00:13:24.071 "seek_data": false, 00:13:24.071 "copy": true, 00:13:24.071 
"nvme_iov_md": false 00:13:24.071 }, 00:13:24.071 "memory_domains": [ 00:13:24.071 { 00:13:24.071 "dma_device_id": "system", 00:13:24.071 "dma_device_type": 1 00:13:24.071 }, 00:13:24.071 { 00:13:24.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.071 "dma_device_type": 2 00:13:24.071 } 00:13:24.071 ], 00:13:24.071 "driver_specific": {} 00:13:24.071 } 00:13:24.071 ] 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.071 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.330 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.330 "name": "Existed_Raid", 00:13:24.330 "uuid": "12473803-7f5b-40c2-aa30-aebe4e44580d", 00:13:24.330 "strip_size_kb": 0, 00:13:24.330 "state": "online", 00:13:24.330 "raid_level": "raid1", 00:13:24.330 "superblock": true, 00:13:24.330 "num_base_bdevs": 2, 00:13:24.330 "num_base_bdevs_discovered": 2, 00:13:24.330 "num_base_bdevs_operational": 2, 00:13:24.330 "base_bdevs_list": [ 00:13:24.330 { 00:13:24.330 "name": "BaseBdev1", 00:13:24.330 "uuid": "06faaa39-9e2e-4840-a6e4-73924ddf15a2", 00:13:24.330 "is_configured": true, 00:13:24.330 "data_offset": 2048, 00:13:24.330 "data_size": 63488 00:13:24.330 }, 00:13:24.330 { 00:13:24.330 "name": "BaseBdev2", 00:13:24.330 "uuid": "720de466-ebe8-4cb0-9ed6-d820c468005e", 00:13:24.330 "is_configured": true, 00:13:24.330 "data_offset": 2048, 00:13:24.330 "data_size": 63488 00:13:24.330 } 00:13:24.330 ] 00:13:24.330 }' 00:13:24.331 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.331 10:21:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:24.898 10:21:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:25.157 [2024-07-15 10:21:02.198333] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:25.157 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:25.157 "name": "Existed_Raid", 00:13:25.157 "aliases": [ 00:13:25.157 "12473803-7f5b-40c2-aa30-aebe4e44580d" 00:13:25.157 ], 00:13:25.157 "product_name": "Raid Volume", 00:13:25.157 "block_size": 512, 00:13:25.157 "num_blocks": 63488, 00:13:25.157 "uuid": "12473803-7f5b-40c2-aa30-aebe4e44580d", 00:13:25.157 "assigned_rate_limits": { 00:13:25.157 "rw_ios_per_sec": 0, 00:13:25.157 "rw_mbytes_per_sec": 0, 00:13:25.157 "r_mbytes_per_sec": 0, 00:13:25.157 "w_mbytes_per_sec": 0 00:13:25.157 }, 00:13:25.157 "claimed": false, 00:13:25.157 "zoned": false, 00:13:25.157 "supported_io_types": { 00:13:25.157 "read": true, 00:13:25.157 "write": true, 00:13:25.157 "unmap": false, 00:13:25.157 "flush": false, 00:13:25.157 "reset": true, 00:13:25.157 "nvme_admin": false, 00:13:25.157 "nvme_io": false, 00:13:25.157 "nvme_io_md": false, 00:13:25.157 "write_zeroes": true, 00:13:25.157 "zcopy": false, 00:13:25.157 "get_zone_info": false, 00:13:25.157 "zone_management": false, 00:13:25.157 "zone_append": false, 00:13:25.157 "compare": false, 00:13:25.157 "compare_and_write": false, 00:13:25.157 "abort": false, 00:13:25.157 "seek_hole": false, 00:13:25.157 "seek_data": false, 00:13:25.157 "copy": false, 00:13:25.157 "nvme_iov_md": false 00:13:25.157 }, 00:13:25.157 "memory_domains": [ 00:13:25.157 { 
00:13:25.157 "dma_device_id": "system", 00:13:25.157 "dma_device_type": 1 00:13:25.157 }, 00:13:25.157 { 00:13:25.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.157 "dma_device_type": 2 00:13:25.157 }, 00:13:25.157 { 00:13:25.157 "dma_device_id": "system", 00:13:25.157 "dma_device_type": 1 00:13:25.157 }, 00:13:25.157 { 00:13:25.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.157 "dma_device_type": 2 00:13:25.157 } 00:13:25.157 ], 00:13:25.157 "driver_specific": { 00:13:25.157 "raid": { 00:13:25.157 "uuid": "12473803-7f5b-40c2-aa30-aebe4e44580d", 00:13:25.157 "strip_size_kb": 0, 00:13:25.157 "state": "online", 00:13:25.157 "raid_level": "raid1", 00:13:25.157 "superblock": true, 00:13:25.157 "num_base_bdevs": 2, 00:13:25.157 "num_base_bdevs_discovered": 2, 00:13:25.157 "num_base_bdevs_operational": 2, 00:13:25.157 "base_bdevs_list": [ 00:13:25.157 { 00:13:25.157 "name": "BaseBdev1", 00:13:25.157 "uuid": "06faaa39-9e2e-4840-a6e4-73924ddf15a2", 00:13:25.157 "is_configured": true, 00:13:25.157 "data_offset": 2048, 00:13:25.157 "data_size": 63488 00:13:25.157 }, 00:13:25.157 { 00:13:25.157 "name": "BaseBdev2", 00:13:25.157 "uuid": "720de466-ebe8-4cb0-9ed6-d820c468005e", 00:13:25.157 "is_configured": true, 00:13:25.157 "data_offset": 2048, 00:13:25.157 "data_size": 63488 00:13:25.157 } 00:13:25.157 ] 00:13:25.157 } 00:13:25.157 } 00:13:25.157 }' 00:13:25.157 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:25.157 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:25.157 BaseBdev2' 00:13:25.157 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:25.157 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:13:25.157 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:25.416 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:25.416 "name": "BaseBdev1", 00:13:25.416 "aliases": [ 00:13:25.416 "06faaa39-9e2e-4840-a6e4-73924ddf15a2" 00:13:25.416 ], 00:13:25.416 "product_name": "Malloc disk", 00:13:25.416 "block_size": 512, 00:13:25.416 "num_blocks": 65536, 00:13:25.416 "uuid": "06faaa39-9e2e-4840-a6e4-73924ddf15a2", 00:13:25.416 "assigned_rate_limits": { 00:13:25.416 "rw_ios_per_sec": 0, 00:13:25.416 "rw_mbytes_per_sec": 0, 00:13:25.416 "r_mbytes_per_sec": 0, 00:13:25.416 "w_mbytes_per_sec": 0 00:13:25.416 }, 00:13:25.416 "claimed": true, 00:13:25.416 "claim_type": "exclusive_write", 00:13:25.416 "zoned": false, 00:13:25.416 "supported_io_types": { 00:13:25.416 "read": true, 00:13:25.416 "write": true, 00:13:25.416 "unmap": true, 00:13:25.416 "flush": true, 00:13:25.416 "reset": true, 00:13:25.416 "nvme_admin": false, 00:13:25.416 "nvme_io": false, 00:13:25.416 "nvme_io_md": false, 00:13:25.416 "write_zeroes": true, 00:13:25.416 "zcopy": true, 00:13:25.416 "get_zone_info": false, 00:13:25.416 "zone_management": false, 00:13:25.416 "zone_append": false, 00:13:25.416 "compare": false, 00:13:25.416 "compare_and_write": false, 00:13:25.416 "abort": true, 00:13:25.416 "seek_hole": false, 00:13:25.416 "seek_data": false, 00:13:25.416 "copy": true, 00:13:25.416 "nvme_iov_md": false 00:13:25.416 }, 00:13:25.416 "memory_domains": [ 00:13:25.416 { 00:13:25.416 "dma_device_id": "system", 00:13:25.416 "dma_device_type": 1 00:13:25.416 }, 00:13:25.416 { 00:13:25.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.416 "dma_device_type": 2 00:13:25.416 } 00:13:25.416 ], 00:13:25.416 "driver_specific": {} 00:13:25.416 }' 00:13:25.416 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.416 10:21:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.416 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:25.416 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:25.675 10:21:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:25.934 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:25.934 "name": "BaseBdev2", 00:13:25.934 "aliases": [ 00:13:25.934 "720de466-ebe8-4cb0-9ed6-d820c468005e" 00:13:25.934 ], 00:13:25.934 "product_name": "Malloc disk", 00:13:25.934 "block_size": 512, 00:13:25.934 "num_blocks": 65536, 00:13:25.934 "uuid": "720de466-ebe8-4cb0-9ed6-d820c468005e", 00:13:25.934 
"assigned_rate_limits": { 00:13:25.934 "rw_ios_per_sec": 0, 00:13:25.934 "rw_mbytes_per_sec": 0, 00:13:25.934 "r_mbytes_per_sec": 0, 00:13:25.934 "w_mbytes_per_sec": 0 00:13:25.934 }, 00:13:25.934 "claimed": true, 00:13:25.934 "claim_type": "exclusive_write", 00:13:25.934 "zoned": false, 00:13:25.934 "supported_io_types": { 00:13:25.934 "read": true, 00:13:25.934 "write": true, 00:13:25.934 "unmap": true, 00:13:25.934 "flush": true, 00:13:25.934 "reset": true, 00:13:25.934 "nvme_admin": false, 00:13:25.934 "nvme_io": false, 00:13:25.934 "nvme_io_md": false, 00:13:25.934 "write_zeroes": true, 00:13:25.934 "zcopy": true, 00:13:25.934 "get_zone_info": false, 00:13:25.934 "zone_management": false, 00:13:25.934 "zone_append": false, 00:13:25.934 "compare": false, 00:13:25.934 "compare_and_write": false, 00:13:25.934 "abort": true, 00:13:25.934 "seek_hole": false, 00:13:25.934 "seek_data": false, 00:13:25.934 "copy": true, 00:13:25.934 "nvme_iov_md": false 00:13:25.934 }, 00:13:25.934 "memory_domains": [ 00:13:25.934 { 00:13:25.934 "dma_device_id": "system", 00:13:25.934 "dma_device_type": 1 00:13:25.934 }, 00:13:25.934 { 00:13:25.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.934 "dma_device_type": 2 00:13:25.934 } 00:13:25.934 ], 00:13:25.934 "driver_specific": {} 00:13:25.934 }' 00:13:25.934 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:25.934 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:26.193 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:26.451 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:26.451 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:26.451 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:26.710 [2024-07-15 10:21:03.702118] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.710 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.969 10:21:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.969 "name": "Existed_Raid", 00:13:26.969 "uuid": "12473803-7f5b-40c2-aa30-aebe4e44580d", 00:13:26.969 "strip_size_kb": 0, 00:13:26.969 "state": "online", 00:13:26.969 "raid_level": "raid1", 00:13:26.969 "superblock": true, 00:13:26.969 "num_base_bdevs": 2, 00:13:26.969 "num_base_bdevs_discovered": 1, 00:13:26.969 "num_base_bdevs_operational": 1, 00:13:26.969 "base_bdevs_list": [ 00:13:26.969 { 00:13:26.969 "name": null, 00:13:26.969 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.969 "is_configured": false, 00:13:26.969 "data_offset": 2048, 00:13:26.969 "data_size": 63488 00:13:26.969 }, 00:13:26.969 { 00:13:26.969 "name": "BaseBdev2", 00:13:26.969 "uuid": "720de466-ebe8-4cb0-9ed6-d820c468005e", 00:13:26.969 "is_configured": true, 00:13:26.969 "data_offset": 2048, 00:13:26.969 "data_size": 63488 00:13:26.969 } 00:13:26.969 ] 00:13:26.969 }' 00:13:26.969 10:21:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.969 10:21:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:27.536 10:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:27.536 10:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:27.536 10:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.536 10:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:27.794 10:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:27.794 10:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:27.794 10:21:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:28.053 [2024-07-15 10:21:05.010845] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:28.053 [2024-07-15 10:21:05.010938] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:28.053 [2024-07-15 10:21:05.022044] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.053 [2024-07-15 10:21:05.022082] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:28.053 [2024-07-15 10:21:05.022094] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1892000 name Existed_Raid, state offline 00:13:28.053 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:28.053 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:13:28.053 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.053 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 488425 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 488425 ']' 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 488425 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 488425 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 488425' 00:13:28.621 killing process with pid 488425 00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 488425 00:13:28.621 [2024-07-15 10:21:05.603454] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 
00:13:28.621 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 488425 00:13:28.621 [2024-07-15 10:21:05.604434] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:28.881 10:21:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:28.881 00:13:28.881 real 0m10.894s 00:13:28.881 user 0m19.411s 00:13:28.881 sys 0m1.975s 00:13:28.881 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:28.881 10:21:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:28.881 ************************************ 00:13:28.881 END TEST raid_state_function_test_sb 00:13:28.881 ************************************ 00:13:28.881 10:21:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:28.881 10:21:05 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:13:28.881 10:21:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:28.881 10:21:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:28.881 10:21:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:28.881 ************************************ 00:13:28.881 START TEST raid_superblock_test 00:13:28.881 ************************************ 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # 
base_bdevs_pt=() 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=490499 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 490499 /var/tmp/spdk-raid.sock 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 490499 ']' 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:13:28.881 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:28.881 10:21:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.881 [2024-07-15 10:21:05.950823] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:13:28.881 [2024-07-15 10:21:05.950887] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid490499 ] 00:13:29.140 [2024-07-15 10:21:06.080276] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:29.140 [2024-07-15 10:21:06.190525] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:29.140 [2024-07-15 10:21:06.254094] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.140 [2024-07-15 10:21:06.254129] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:29.399 10:21:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:29.399 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:29.658 malloc1 00:13:29.658 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:29.917 [2024-07-15 10:21:06.873127] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:29.917 [2024-07-15 10:21:06.873172] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:29.917 [2024-07-15 10:21:06.873193] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db0570 00:13:29.917 [2024-07-15 10:21:06.873206] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:29.917 [2024-07-15 10:21:06.875022] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:29.917 [2024-07-15 10:21:06.875051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:29.917 pt1 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:29.917 10:21:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:29.917 10:21:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:30.483 malloc2 00:13:30.483 10:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:30.483 [2024-07-15 10:21:07.625148] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:30.483 [2024-07-15 10:21:07.625197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.483 [2024-07-15 10:21:07.625214] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db1970 00:13:30.483 [2024-07-15 10:21:07.625228] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.483 [2024-07-15 10:21:07.626878] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.483 [2024-07-15 10:21:07.626906] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:30.483 pt2 00:13:30.483 10:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:30.483 10:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:30.483 10:21:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:31.051 [2024-07-15 10:21:08.126470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:31.051 [2024-07-15 10:21:08.127795] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:31.051 [2024-07-15 10:21:08.127953] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f54270 00:13:31.051 [2024-07-15 10:21:08.127967] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:31.051 [2024-07-15 10:21:08.128166] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da80e0 00:13:31.051 [2024-07-15 10:21:08.128313] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f54270 00:13:31.051 [2024-07-15 10:21:08.128323] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f54270 00:13:31.051 [2024-07-15 10:21:08.128423] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.051 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.052 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.052 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:31.310 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.310 "name": "raid_bdev1", 00:13:31.310 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:31.310 "strip_size_kb": 0, 00:13:31.310 "state": "online", 00:13:31.310 "raid_level": "raid1", 00:13:31.310 "superblock": true, 00:13:31.310 "num_base_bdevs": 2, 00:13:31.310 "num_base_bdevs_discovered": 2, 00:13:31.310 "num_base_bdevs_operational": 2, 00:13:31.310 "base_bdevs_list": [ 00:13:31.310 { 00:13:31.310 "name": "pt1", 00:13:31.310 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:31.310 "is_configured": true, 00:13:31.310 "data_offset": 2048, 00:13:31.310 "data_size": 63488 00:13:31.310 }, 00:13:31.310 { 00:13:31.310 "name": "pt2", 00:13:31.310 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:31.310 "is_configured": true, 00:13:31.310 "data_offset": 2048, 00:13:31.310 "data_size": 63488 00:13:31.310 } 00:13:31.310 ] 00:13:31.310 }' 00:13:31.310 10:21:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.310 10:21:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.878 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:31.878 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:31.878 10:21:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:31.878 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:31.878 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:31.878 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:31.878 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:31.878 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:32.136 [2024-07-15 10:21:09.161436] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:32.136 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:32.136 "name": "raid_bdev1", 00:13:32.136 "aliases": [ 00:13:32.136 "de9dab60-8775-443b-ae20-39b05eb936e3" 00:13:32.136 ], 00:13:32.136 "product_name": "Raid Volume", 00:13:32.136 "block_size": 512, 00:13:32.136 "num_blocks": 63488, 00:13:32.136 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:32.136 "assigned_rate_limits": { 00:13:32.136 "rw_ios_per_sec": 0, 00:13:32.136 "rw_mbytes_per_sec": 0, 00:13:32.136 "r_mbytes_per_sec": 0, 00:13:32.136 "w_mbytes_per_sec": 0 00:13:32.136 }, 00:13:32.136 "claimed": false, 00:13:32.136 "zoned": false, 00:13:32.136 "supported_io_types": { 00:13:32.136 "read": true, 00:13:32.136 "write": true, 00:13:32.136 "unmap": false, 00:13:32.136 "flush": false, 00:13:32.136 "reset": true, 00:13:32.136 "nvme_admin": false, 00:13:32.136 "nvme_io": false, 00:13:32.136 "nvme_io_md": false, 00:13:32.136 "write_zeroes": true, 00:13:32.136 "zcopy": false, 00:13:32.136 "get_zone_info": false, 00:13:32.136 "zone_management": false, 00:13:32.136 "zone_append": false, 00:13:32.136 "compare": false, 00:13:32.136 "compare_and_write": false, 00:13:32.136 
"abort": false, 00:13:32.136 "seek_hole": false, 00:13:32.136 "seek_data": false, 00:13:32.136 "copy": false, 00:13:32.136 "nvme_iov_md": false 00:13:32.136 }, 00:13:32.136 "memory_domains": [ 00:13:32.136 { 00:13:32.136 "dma_device_id": "system", 00:13:32.136 "dma_device_type": 1 00:13:32.136 }, 00:13:32.136 { 00:13:32.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.136 "dma_device_type": 2 00:13:32.136 }, 00:13:32.136 { 00:13:32.136 "dma_device_id": "system", 00:13:32.136 "dma_device_type": 1 00:13:32.136 }, 00:13:32.136 { 00:13:32.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.136 "dma_device_type": 2 00:13:32.136 } 00:13:32.136 ], 00:13:32.136 "driver_specific": { 00:13:32.136 "raid": { 00:13:32.136 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:32.136 "strip_size_kb": 0, 00:13:32.136 "state": "online", 00:13:32.136 "raid_level": "raid1", 00:13:32.136 "superblock": true, 00:13:32.136 "num_base_bdevs": 2, 00:13:32.136 "num_base_bdevs_discovered": 2, 00:13:32.136 "num_base_bdevs_operational": 2, 00:13:32.136 "base_bdevs_list": [ 00:13:32.136 { 00:13:32.136 "name": "pt1", 00:13:32.136 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:32.136 "is_configured": true, 00:13:32.136 "data_offset": 2048, 00:13:32.136 "data_size": 63488 00:13:32.136 }, 00:13:32.136 { 00:13:32.136 "name": "pt2", 00:13:32.136 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:32.136 "is_configured": true, 00:13:32.136 "data_offset": 2048, 00:13:32.136 "data_size": 63488 00:13:32.136 } 00:13:32.136 ] 00:13:32.136 } 00:13:32.136 } 00:13:32.136 }' 00:13:32.136 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:32.136 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:32.136 pt2' 00:13:32.136 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.136 10:21:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:32.136 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.395 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.395 "name": "pt1", 00:13:32.395 "aliases": [ 00:13:32.395 "00000000-0000-0000-0000-000000000001" 00:13:32.395 ], 00:13:32.395 "product_name": "passthru", 00:13:32.395 "block_size": 512, 00:13:32.395 "num_blocks": 65536, 00:13:32.395 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:32.395 "assigned_rate_limits": { 00:13:32.395 "rw_ios_per_sec": 0, 00:13:32.395 "rw_mbytes_per_sec": 0, 00:13:32.395 "r_mbytes_per_sec": 0, 00:13:32.395 "w_mbytes_per_sec": 0 00:13:32.395 }, 00:13:32.395 "claimed": true, 00:13:32.395 "claim_type": "exclusive_write", 00:13:32.395 "zoned": false, 00:13:32.395 "supported_io_types": { 00:13:32.395 "read": true, 00:13:32.395 "write": true, 00:13:32.395 "unmap": true, 00:13:32.395 "flush": true, 00:13:32.395 "reset": true, 00:13:32.395 "nvme_admin": false, 00:13:32.395 "nvme_io": false, 00:13:32.395 "nvme_io_md": false, 00:13:32.395 "write_zeroes": true, 00:13:32.395 "zcopy": true, 00:13:32.395 "get_zone_info": false, 00:13:32.395 "zone_management": false, 00:13:32.395 "zone_append": false, 00:13:32.395 "compare": false, 00:13:32.395 "compare_and_write": false, 00:13:32.395 "abort": true, 00:13:32.395 "seek_hole": false, 00:13:32.395 "seek_data": false, 00:13:32.395 "copy": true, 00:13:32.395 "nvme_iov_md": false 00:13:32.395 }, 00:13:32.395 "memory_domains": [ 00:13:32.395 { 00:13:32.395 "dma_device_id": "system", 00:13:32.395 "dma_device_type": 1 00:13:32.395 }, 00:13:32.395 { 00:13:32.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.395 "dma_device_type": 2 00:13:32.395 } 00:13:32.395 ], 00:13:32.395 "driver_specific": { 00:13:32.395 "passthru": { 00:13:32.395 
"name": "pt1", 00:13:32.395 "base_bdev_name": "malloc1" 00:13:32.395 } 00:13:32.395 } 00:13:32.395 }' 00:13:32.395 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.395 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.395 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.395 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:32.654 10:21:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.913 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.913 "name": "pt2", 00:13:32.913 "aliases": [ 00:13:32.913 "00000000-0000-0000-0000-000000000002" 00:13:32.913 ], 00:13:32.913 "product_name": "passthru", 00:13:32.913 "block_size": 512, 00:13:32.913 
"num_blocks": 65536, 00:13:32.913 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:32.913 "assigned_rate_limits": { 00:13:32.913 "rw_ios_per_sec": 0, 00:13:32.913 "rw_mbytes_per_sec": 0, 00:13:32.913 "r_mbytes_per_sec": 0, 00:13:32.913 "w_mbytes_per_sec": 0 00:13:32.913 }, 00:13:32.913 "claimed": true, 00:13:32.913 "claim_type": "exclusive_write", 00:13:32.913 "zoned": false, 00:13:32.913 "supported_io_types": { 00:13:32.913 "read": true, 00:13:32.913 "write": true, 00:13:32.913 "unmap": true, 00:13:32.913 "flush": true, 00:13:32.913 "reset": true, 00:13:32.913 "nvme_admin": false, 00:13:32.913 "nvme_io": false, 00:13:32.913 "nvme_io_md": false, 00:13:32.913 "write_zeroes": true, 00:13:32.913 "zcopy": true, 00:13:32.913 "get_zone_info": false, 00:13:32.913 "zone_management": false, 00:13:32.913 "zone_append": false, 00:13:32.913 "compare": false, 00:13:32.913 "compare_and_write": false, 00:13:32.913 "abort": true, 00:13:32.913 "seek_hole": false, 00:13:32.913 "seek_data": false, 00:13:32.913 "copy": true, 00:13:32.913 "nvme_iov_md": false 00:13:32.913 }, 00:13:32.913 "memory_domains": [ 00:13:32.913 { 00:13:32.913 "dma_device_id": "system", 00:13:32.913 "dma_device_type": 1 00:13:32.913 }, 00:13:32.913 { 00:13:32.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.913 "dma_device_type": 2 00:13:32.913 } 00:13:32.913 ], 00:13:32.913 "driver_specific": { 00:13:32.913 "passthru": { 00:13:32.913 "name": "pt2", 00:13:32.914 "base_bdev_name": "malloc2" 00:13:32.914 } 00:13:32.914 } 00:13:32.914 }' 00:13:32.914 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.914 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.172 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:33.432 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:33.432 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:33.432 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:33.432 [2024-07-15 10:21:10.621316] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:33.691 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=de9dab60-8775-443b-ae20-39b05eb936e3 00:13:33.691 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z de9dab60-8775-443b-ae20-39b05eb936e3 ']' 00:13:33.691 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:33.691 [2024-07-15 10:21:10.869711] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:33.691 [2024-07-15 10:21:10.869735] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:33.691 [2024-07-15 10:21:10.869791] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:33.691 [2024-07-15 
10:21:10.869849] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:33.691 [2024-07-15 10:21:10.869861] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f54270 name raid_bdev1, state offline 00:13:33.949 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.949 10:21:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:33.949 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:33.949 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:33.949 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:33.949 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:34.207 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:34.207 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:34.466 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:34.466 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:34.725 10:21:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:34.984 [2024-07-15 10:21:12.092904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:34.984 [2024-07-15 10:21:12.094324] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:34.984 [2024-07-15 10:21:12.094379] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:34.984 [2024-07-15 10:21:12.094419] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:34.984 [2024-07-15 10:21:12.094438] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:34.984 [2024-07-15 10:21:12.094449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f53ff0 name raid_bdev1, state configuring 00:13:34.984 request: 00:13:34.984 { 00:13:34.984 "name": "raid_bdev1", 00:13:34.984 "raid_level": "raid1", 00:13:34.984 "base_bdevs": [ 00:13:34.984 "malloc1", 00:13:34.984 "malloc2" 00:13:34.984 ], 00:13:34.984 "superblock": false, 00:13:34.984 "method": "bdev_raid_create", 00:13:34.984 "req_id": 1 00:13:34.984 } 00:13:34.984 Got JSON-RPC error response 00:13:34.984 response: 00:13:34.984 { 00:13:34.984 "code": -17, 00:13:34.984 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:34.984 } 00:13:34.984 10:21:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:34.984 10:21:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:34.984 10:21:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:34.984 10:21:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:34.984 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.984 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:35.243 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:13:35.243 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:35.243 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:35.501 [2024-07-15 10:21:12.590143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:35.501 [2024-07-15 10:21:12.590187] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:35.501 [2024-07-15 10:21:12.590208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db07a0 00:13:35.501 [2024-07-15 10:21:12.590221] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:35.501 [2024-07-15 10:21:12.591786] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:35.501 [2024-07-15 10:21:12.591815] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:35.501 [2024-07-15 10:21:12.591880] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:35.502 [2024-07-15 10:21:12.591906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:35.502 pt1 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.502 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:35.761 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.761 "name": "raid_bdev1", 00:13:35.761 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:35.761 "strip_size_kb": 0, 00:13:35.761 "state": "configuring", 00:13:35.761 "raid_level": "raid1", 00:13:35.761 "superblock": true, 00:13:35.761 "num_base_bdevs": 2, 00:13:35.761 "num_base_bdevs_discovered": 1, 00:13:35.761 "num_base_bdevs_operational": 2, 00:13:35.761 "base_bdevs_list": [ 00:13:35.761 { 00:13:35.761 "name": "pt1", 00:13:35.761 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:35.761 "is_configured": true, 00:13:35.761 "data_offset": 2048, 00:13:35.761 "data_size": 63488 00:13:35.761 }, 00:13:35.761 { 00:13:35.761 "name": null, 00:13:35.761 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:35.761 "is_configured": false, 00:13:35.761 "data_offset": 2048, 00:13:35.761 "data_size": 63488 00:13:35.761 } 00:13:35.761 ] 00:13:35.761 }' 00:13:35.761 10:21:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.761 10:21:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.328 10:21:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:13:36.328 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:36.328 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:36.328 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:36.587 [2024-07-15 10:21:13.689059] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:36.587 [2024-07-15 10:21:13.689104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:36.587 [2024-07-15 10:21:13.689124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f486f0 00:13:36.587 [2024-07-15 10:21:13.689137] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:36.587 [2024-07-15 10:21:13.689475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:36.587 [2024-07-15 10:21:13.689494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:36.587 [2024-07-15 10:21:13.689556] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:36.587 [2024-07-15 10:21:13.689575] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:36.587 [2024-07-15 10:21:13.689673] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f49590 00:13:36.587 [2024-07-15 10:21:13.689683] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:36.587 [2024-07-15 10:21:13.689849] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1daa540 00:13:36.587 [2024-07-15 10:21:13.689982] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f49590 00:13:36.587 [2024-07-15 10:21:13.689993] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f49590 00:13:36.587 [2024-07-15 10:21:13.690090] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:36.587 pt2 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.587 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:36.845 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.845 "name": 
"raid_bdev1", 00:13:36.845 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:36.845 "strip_size_kb": 0, 00:13:36.845 "state": "online", 00:13:36.845 "raid_level": "raid1", 00:13:36.845 "superblock": true, 00:13:36.845 "num_base_bdevs": 2, 00:13:36.846 "num_base_bdevs_discovered": 2, 00:13:36.846 "num_base_bdevs_operational": 2, 00:13:36.846 "base_bdevs_list": [ 00:13:36.846 { 00:13:36.846 "name": "pt1", 00:13:36.846 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:36.846 "is_configured": true, 00:13:36.846 "data_offset": 2048, 00:13:36.846 "data_size": 63488 00:13:36.846 }, 00:13:36.846 { 00:13:36.846 "name": "pt2", 00:13:36.846 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:36.846 "is_configured": true, 00:13:36.846 "data_offset": 2048, 00:13:36.846 "data_size": 63488 00:13:36.846 } 00:13:36.846 ] 00:13:36.846 }' 00:13:36.846 10:21:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.846 10:21:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:37.415 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:37.706 [2024-07-15 
10:21:14.772175] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:37.706 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:37.706 "name": "raid_bdev1", 00:13:37.706 "aliases": [ 00:13:37.706 "de9dab60-8775-443b-ae20-39b05eb936e3" 00:13:37.706 ], 00:13:37.706 "product_name": "Raid Volume", 00:13:37.706 "block_size": 512, 00:13:37.706 "num_blocks": 63488, 00:13:37.706 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:37.706 "assigned_rate_limits": { 00:13:37.706 "rw_ios_per_sec": 0, 00:13:37.706 "rw_mbytes_per_sec": 0, 00:13:37.706 "r_mbytes_per_sec": 0, 00:13:37.706 "w_mbytes_per_sec": 0 00:13:37.706 }, 00:13:37.706 "claimed": false, 00:13:37.706 "zoned": false, 00:13:37.706 "supported_io_types": { 00:13:37.706 "read": true, 00:13:37.706 "write": true, 00:13:37.706 "unmap": false, 00:13:37.706 "flush": false, 00:13:37.706 "reset": true, 00:13:37.706 "nvme_admin": false, 00:13:37.706 "nvme_io": false, 00:13:37.706 "nvme_io_md": false, 00:13:37.706 "write_zeroes": true, 00:13:37.706 "zcopy": false, 00:13:37.706 "get_zone_info": false, 00:13:37.706 "zone_management": false, 00:13:37.706 "zone_append": false, 00:13:37.706 "compare": false, 00:13:37.706 "compare_and_write": false, 00:13:37.706 "abort": false, 00:13:37.706 "seek_hole": false, 00:13:37.706 "seek_data": false, 00:13:37.706 "copy": false, 00:13:37.706 "nvme_iov_md": false 00:13:37.706 }, 00:13:37.706 "memory_domains": [ 00:13:37.706 { 00:13:37.706 "dma_device_id": "system", 00:13:37.706 "dma_device_type": 1 00:13:37.706 }, 00:13:37.706 { 00:13:37.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.706 "dma_device_type": 2 00:13:37.706 }, 00:13:37.706 { 00:13:37.706 "dma_device_id": "system", 00:13:37.706 "dma_device_type": 1 00:13:37.706 }, 00:13:37.706 { 00:13:37.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.706 "dma_device_type": 2 00:13:37.706 } 00:13:37.706 ], 00:13:37.706 "driver_specific": { 00:13:37.706 
"raid": { 00:13:37.706 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:37.706 "strip_size_kb": 0, 00:13:37.706 "state": "online", 00:13:37.706 "raid_level": "raid1", 00:13:37.706 "superblock": true, 00:13:37.706 "num_base_bdevs": 2, 00:13:37.706 "num_base_bdevs_discovered": 2, 00:13:37.706 "num_base_bdevs_operational": 2, 00:13:37.706 "base_bdevs_list": [ 00:13:37.706 { 00:13:37.706 "name": "pt1", 00:13:37.706 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:37.706 "is_configured": true, 00:13:37.706 "data_offset": 2048, 00:13:37.706 "data_size": 63488 00:13:37.706 }, 00:13:37.706 { 00:13:37.706 "name": "pt2", 00:13:37.706 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:37.706 "is_configured": true, 00:13:37.706 "data_offset": 2048, 00:13:37.706 "data_size": 63488 00:13:37.706 } 00:13:37.706 ] 00:13:37.706 } 00:13:37.706 } 00:13:37.706 }' 00:13:37.706 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:37.706 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:37.706 pt2' 00:13:37.706 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:37.706 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:37.706 10:21:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:37.964 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:37.964 "name": "pt1", 00:13:37.964 "aliases": [ 00:13:37.964 "00000000-0000-0000-0000-000000000001" 00:13:37.964 ], 00:13:37.964 "product_name": "passthru", 00:13:37.964 "block_size": 512, 00:13:37.964 "num_blocks": 65536, 00:13:37.964 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:37.964 "assigned_rate_limits": { 
00:13:37.964 "rw_ios_per_sec": 0, 00:13:37.964 "rw_mbytes_per_sec": 0, 00:13:37.964 "r_mbytes_per_sec": 0, 00:13:37.964 "w_mbytes_per_sec": 0 00:13:37.964 }, 00:13:37.964 "claimed": true, 00:13:37.964 "claim_type": "exclusive_write", 00:13:37.964 "zoned": false, 00:13:37.964 "supported_io_types": { 00:13:37.965 "read": true, 00:13:37.965 "write": true, 00:13:37.965 "unmap": true, 00:13:37.965 "flush": true, 00:13:37.965 "reset": true, 00:13:37.965 "nvme_admin": false, 00:13:37.965 "nvme_io": false, 00:13:37.965 "nvme_io_md": false, 00:13:37.965 "write_zeroes": true, 00:13:37.965 "zcopy": true, 00:13:37.965 "get_zone_info": false, 00:13:37.965 "zone_management": false, 00:13:37.965 "zone_append": false, 00:13:37.965 "compare": false, 00:13:37.965 "compare_and_write": false, 00:13:37.965 "abort": true, 00:13:37.965 "seek_hole": false, 00:13:37.965 "seek_data": false, 00:13:37.965 "copy": true, 00:13:37.965 "nvme_iov_md": false 00:13:37.965 }, 00:13:37.965 "memory_domains": [ 00:13:37.965 { 00:13:37.965 "dma_device_id": "system", 00:13:37.965 "dma_device_type": 1 00:13:37.965 }, 00:13:37.965 { 00:13:37.965 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.965 "dma_device_type": 2 00:13:37.965 } 00:13:37.965 ], 00:13:37.965 "driver_specific": { 00:13:37.965 "passthru": { 00:13:37.965 "name": "pt1", 00:13:37.965 "base_bdev_name": "malloc1" 00:13:37.965 } 00:13:37.965 } 00:13:37.965 }' 00:13:37.965 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.965 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:38.223 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:38.494 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:38.494 "name": "pt2", 00:13:38.494 "aliases": [ 00:13:38.494 "00000000-0000-0000-0000-000000000002" 00:13:38.494 ], 00:13:38.494 "product_name": "passthru", 00:13:38.494 "block_size": 512, 00:13:38.494 "num_blocks": 65536, 00:13:38.494 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:38.494 "assigned_rate_limits": { 00:13:38.494 "rw_ios_per_sec": 0, 00:13:38.494 "rw_mbytes_per_sec": 0, 00:13:38.494 "r_mbytes_per_sec": 0, 00:13:38.494 "w_mbytes_per_sec": 0 00:13:38.494 }, 00:13:38.494 "claimed": true, 00:13:38.494 "claim_type": "exclusive_write", 00:13:38.494 "zoned": false, 00:13:38.494 "supported_io_types": { 00:13:38.494 "read": true, 00:13:38.494 "write": true, 00:13:38.494 "unmap": true, 00:13:38.494 "flush": true, 00:13:38.494 "reset": true, 00:13:38.494 "nvme_admin": false, 00:13:38.494 "nvme_io": false, 00:13:38.494 "nvme_io_md": false, 00:13:38.494 "write_zeroes": true, 
00:13:38.494 "zcopy": true, 00:13:38.494 "get_zone_info": false, 00:13:38.494 "zone_management": false, 00:13:38.494 "zone_append": false, 00:13:38.494 "compare": false, 00:13:38.494 "compare_and_write": false, 00:13:38.494 "abort": true, 00:13:38.494 "seek_hole": false, 00:13:38.494 "seek_data": false, 00:13:38.494 "copy": true, 00:13:38.494 "nvme_iov_md": false 00:13:38.494 }, 00:13:38.494 "memory_domains": [ 00:13:38.494 { 00:13:38.494 "dma_device_id": "system", 00:13:38.494 "dma_device_type": 1 00:13:38.494 }, 00:13:38.494 { 00:13:38.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:38.494 "dma_device_type": 2 00:13:38.494 } 00:13:38.494 ], 00:13:38.494 "driver_specific": { 00:13:38.494 "passthru": { 00:13:38.494 "name": "pt2", 00:13:38.494 "base_bdev_name": "malloc2" 00:13:38.494 } 00:13:38.494 } 00:13:38.494 }' 00:13:38.494 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:38.752 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.011 10:21:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:39.011 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:13:39.011 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:39.011 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:39.269 [2024-07-15 10:21:16.232061] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:39.269 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' de9dab60-8775-443b-ae20-39b05eb936e3 '!=' de9dab60-8775-443b-ae20-39b05eb936e3 ']' 00:13:39.269 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:13:39.269 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:39.269 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:39.269 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:39.528 [2024-07-15 10:21:16.476471] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:39.528 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.787 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.787 "name": "raid_bdev1", 00:13:39.787 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:39.787 "strip_size_kb": 0, 00:13:39.787 "state": "online", 00:13:39.787 "raid_level": "raid1", 00:13:39.787 "superblock": true, 00:13:39.787 "num_base_bdevs": 2, 00:13:39.787 "num_base_bdevs_discovered": 1, 00:13:39.787 "num_base_bdevs_operational": 1, 00:13:39.787 "base_bdevs_list": [ 00:13:39.787 { 00:13:39.787 "name": null, 00:13:39.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.787 "is_configured": false, 00:13:39.787 "data_offset": 2048, 00:13:39.787 "data_size": 63488 00:13:39.787 }, 00:13:39.787 { 00:13:39.787 "name": "pt2", 00:13:39.787 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:39.787 "is_configured": true, 00:13:39.787 "data_offset": 2048, 00:13:39.787 "data_size": 63488 00:13:39.787 } 00:13:39.787 ] 00:13:39.787 }' 00:13:39.787 10:21:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.787 10:21:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.355 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:13:40.614 [2024-07-15 10:21:17.571364] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:40.614 [2024-07-15 10:21:17.571394] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:40.614 [2024-07-15 10:21:17.571450] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:40.614 [2024-07-15 10:21:17.571492] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:40.614 [2024-07-15 10:21:17.571504] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f49590 name raid_bdev1, state offline 00:13:40.614 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.614 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:13:40.873 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:13:40.873 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:13:40.873 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:13:40.873 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:40.873 10:21:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:41.132 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:41.132 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:41.132 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:13:41.132 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:41.132 10:21:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:13:41.132 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:41.132 [2024-07-15 10:21:18.309277] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:41.132 [2024-07-15 10:21:18.309325] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:41.132 [2024-07-15 10:21:18.309343] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db1160 00:13:41.132 [2024-07-15 10:21:18.309356] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:41.132 [2024-07-15 10:21:18.310956] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:41.132 [2024-07-15 10:21:18.310984] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:41.132 [2024-07-15 10:21:18.311048] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:41.132 [2024-07-15 10:21:18.311073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:41.132 [2024-07-15 10:21:18.311156] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1da7380 00:13:41.132 [2024-07-15 10:21:18.311167] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:41.132 [2024-07-15 10:21:18.311337] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da8a80 00:13:41.132 [2024-07-15 10:21:18.311457] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1da7380 00:13:41.132 [2024-07-15 10:21:18.311467] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1da7380 00:13:41.132 [2024-07-15 10:21:18.311561] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:41.132 pt2 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.391 "name": "raid_bdev1", 00:13:41.391 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:41.391 "strip_size_kb": 0, 00:13:41.391 "state": "online", 00:13:41.391 "raid_level": "raid1", 00:13:41.391 "superblock": true, 00:13:41.391 "num_base_bdevs": 2, 00:13:41.391 "num_base_bdevs_discovered": 1, 00:13:41.391 "num_base_bdevs_operational": 1, 00:13:41.391 "base_bdevs_list": [ 
00:13:41.391 { 00:13:41.391 "name": null, 00:13:41.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.391 "is_configured": false, 00:13:41.391 "data_offset": 2048, 00:13:41.391 "data_size": 63488 00:13:41.391 }, 00:13:41.391 { 00:13:41.391 "name": "pt2", 00:13:41.391 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:41.391 "is_configured": true, 00:13:41.391 "data_offset": 2048, 00:13:41.391 "data_size": 63488 00:13:41.391 } 00:13:41.391 ] 00:13:41.391 }' 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.391 10:21:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.329 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:42.329 [2024-07-15 10:21:19.396141] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:42.329 [2024-07-15 10:21:19.396170] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:42.329 [2024-07-15 10:21:19.396220] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:42.329 [2024-07-15 10:21:19.396265] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:42.329 [2024-07-15 10:21:19.396278] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1da7380 name raid_bdev1, state offline 00:13:42.329 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.329 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:13:42.587 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:13:42.587 10:21:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:13:42.587 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:13:42.587 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:42.846 [2024-07-15 10:21:19.893441] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:42.846 [2024-07-15 10:21:19.893482] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:42.846 [2024-07-15 10:21:19.893499] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f53520 00:13:42.846 [2024-07-15 10:21:19.893512] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:42.846 [2024-07-15 10:21:19.895104] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:42.846 [2024-07-15 10:21:19.895137] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:42.846 [2024-07-15 10:21:19.895203] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:42.846 [2024-07-15 10:21:19.895228] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:42.846 [2024-07-15 10:21:19.895323] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:42.846 [2024-07-15 10:21:19.895336] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:42.846 [2024-07-15 10:21:19.895349] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1da83f0 name raid_bdev1, state configuring 00:13:42.846 [2024-07-15 10:21:19.895372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:42.846 [2024-07-15 10:21:19.895431] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1daa2b0 00:13:42.846 [2024-07-15 10:21:19.895441] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:42.846 [2024-07-15 10:21:19.895601] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da7350 00:13:42.846 [2024-07-15 10:21:19.895722] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1daa2b0 00:13:42.846 [2024-07-15 10:21:19.895731] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1daa2b0 00:13:42.846 [2024-07-15 10:21:19.895828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:42.846 pt1 00:13:42.846 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:13:42.846 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:42.846 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:42.846 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.846 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:42.846 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:42.846 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:42.847 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.847 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.847 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.847 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.847 10:21:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.847 10:21:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:43.105 10:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:43.105 "name": "raid_bdev1", 00:13:43.105 "uuid": "de9dab60-8775-443b-ae20-39b05eb936e3", 00:13:43.105 "strip_size_kb": 0, 00:13:43.105 "state": "online", 00:13:43.105 "raid_level": "raid1", 00:13:43.105 "superblock": true, 00:13:43.105 "num_base_bdevs": 2, 00:13:43.105 "num_base_bdevs_discovered": 1, 00:13:43.105 "num_base_bdevs_operational": 1, 00:13:43.105 "base_bdevs_list": [ 00:13:43.105 { 00:13:43.105 "name": null, 00:13:43.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.105 "is_configured": false, 00:13:43.105 "data_offset": 2048, 00:13:43.105 "data_size": 63488 00:13:43.105 }, 00:13:43.105 { 00:13:43.105 "name": "pt2", 00:13:43.105 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:43.105 "is_configured": true, 00:13:43.105 "data_offset": 2048, 00:13:43.105 "data_size": 63488 00:13:43.105 } 00:13:43.105 ] 00:13:43.105 }' 00:13:43.105 10:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:43.105 10:21:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.671 10:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:43.671 10:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:43.930 10:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:13:43.930 10:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:43.930 10:21:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:13:44.189 [2024-07-15 10:21:21.161027] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' de9dab60-8775-443b-ae20-39b05eb936e3 '!=' de9dab60-8775-443b-ae20-39b05eb936e3 ']' 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 490499 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 490499 ']' 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 490499 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 490499 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 490499' 00:13:44.189 killing process with pid 490499 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 490499 00:13:44.189 [2024-07-15 10:21:21.228427] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:44.189 [2024-07-15 10:21:21.228479] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:44.189 [2024-07-15 10:21:21.228521] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:13:44.189 [2024-07-15 10:21:21.228533] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1daa2b0 name raid_bdev1, state offline 00:13:44.189 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 490499 00:13:44.189 [2024-07-15 10:21:21.244913] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:44.448 10:21:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:44.448 00:13:44.448 real 0m15.549s 00:13:44.448 user 0m28.634s 00:13:44.448 sys 0m2.911s 00:13:44.448 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:44.448 10:21:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.448 ************************************ 00:13:44.448 END TEST raid_superblock_test 00:13:44.448 ************************************ 00:13:44.448 10:21:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:44.448 10:21:21 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:44.448 10:21:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:44.448 10:21:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:44.448 10:21:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:44.448 ************************************ 00:13:44.448 START TEST raid_read_error_test 00:13:44.448 ************************************ 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:44.448 10:21:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.lRihcEPBlQ 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@808 -- # raid_pid=492999 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 492999 /var/tmp/spdk-raid.sock 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 492999 ']' 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:44.448 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:44.448 10:21:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.448 [2024-07-15 10:21:21.595265] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:13:44.448 [2024-07-15 10:21:21.595330] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid492999 ] 00:13:44.706 [2024-07-15 10:21:21.717245] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:44.707 [2024-07-15 10:21:21.822855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:44.707 [2024-07-15 10:21:21.890170] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.707 [2024-07-15 10:21:21.890209] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:45.642 10:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:45.642 10:21:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:45.642 10:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:45.642 10:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:45.642 BaseBdev1_malloc 00:13:45.642 10:21:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:45.900 true 00:13:45.900 10:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:46.158 [2024-07-15 10:21:23.245476] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:46.158 [2024-07-15 10:21:23.245521] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:46.158 [2024-07-15 10:21:23.245542] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc780d0 00:13:46.158 [2024-07-15 10:21:23.245554] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:46.158 [2024-07-15 10:21:23.247426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:46.158 [2024-07-15 10:21:23.247454] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:46.158 BaseBdev1 00:13:46.158 10:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:46.158 10:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:46.416 BaseBdev2_malloc 00:13:46.416 10:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:46.674 true 00:13:46.674 10:21:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:46.933 [2024-07-15 10:21:23.988895] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:46.933 [2024-07-15 10:21:23.988944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:46.933 [2024-07-15 10:21:23.988965] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc7c910 00:13:46.933 [2024-07-15 10:21:23.988979] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:46.933 [2024-07-15 10:21:23.990534] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:46.933 [2024-07-15 10:21:23.990562] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:46.933 BaseBdev2 00:13:46.933 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:47.191 [2024-07-15 10:21:24.233566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:47.191 [2024-07-15 10:21:24.234913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:47.191 [2024-07-15 10:21:24.235109] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc7e320 00:13:47.191 [2024-07-15 10:21:24.235122] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:47.191 [2024-07-15 10:21:24.235320] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xae5d00 00:13:47.191 [2024-07-15 10:21:24.235472] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc7e320 00:13:47.191 [2024-07-15 10:21:24.235482] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc7e320 00:13:47.191 [2024-07-15 10:21:24.235594] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=2 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.191 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:47.450 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.450 "name": "raid_bdev1", 00:13:47.450 "uuid": "2fae1d77-8a88-4057-8c1a-2b586ed3ee2b", 00:13:47.450 "strip_size_kb": 0, 00:13:47.450 "state": "online", 00:13:47.450 "raid_level": "raid1", 00:13:47.450 "superblock": true, 00:13:47.450 "num_base_bdevs": 2, 00:13:47.450 "num_base_bdevs_discovered": 2, 00:13:47.450 "num_base_bdevs_operational": 2, 00:13:47.450 "base_bdevs_list": [ 00:13:47.450 { 00:13:47.450 "name": "BaseBdev1", 00:13:47.450 "uuid": "30741ad2-21ae-5cba-96a5-6d54db34fb70", 00:13:47.450 "is_configured": true, 00:13:47.450 "data_offset": 2048, 00:13:47.450 "data_size": 63488 00:13:47.450 }, 00:13:47.450 { 00:13:47.450 "name": "BaseBdev2", 00:13:47.450 "uuid": "93edf9b4-9807-526e-9e02-c1a7d8c01bf5", 00:13:47.450 "is_configured": true, 00:13:47.450 "data_offset": 2048, 00:13:47.450 "data_size": 63488 00:13:47.450 } 00:13:47.450 ] 00:13:47.450 }' 00:13:47.450 10:21:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.450 10:21:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.016 10:21:25 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:48.016 10:21:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:48.274 [2024-07-15 10:21:25.240517] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc79c70 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.211 10:21:26 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:49.211 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.470 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.470 "name": "raid_bdev1", 00:13:49.470 "uuid": "2fae1d77-8a88-4057-8c1a-2b586ed3ee2b", 00:13:49.470 "strip_size_kb": 0, 00:13:49.470 "state": "online", 00:13:49.470 "raid_level": "raid1", 00:13:49.470 "superblock": true, 00:13:49.470 "num_base_bdevs": 2, 00:13:49.470 "num_base_bdevs_discovered": 2, 00:13:49.470 "num_base_bdevs_operational": 2, 00:13:49.470 "base_bdevs_list": [ 00:13:49.470 { 00:13:49.470 "name": "BaseBdev1", 00:13:49.470 "uuid": "30741ad2-21ae-5cba-96a5-6d54db34fb70", 00:13:49.470 "is_configured": true, 00:13:49.470 "data_offset": 2048, 00:13:49.470 "data_size": 63488 00:13:49.470 }, 00:13:49.470 { 00:13:49.471 "name": "BaseBdev2", 00:13:49.471 "uuid": "93edf9b4-9807-526e-9e02-c1a7d8c01bf5", 00:13:49.471 "is_configured": true, 00:13:49.471 "data_offset": 2048, 00:13:49.471 "data_size": 63488 00:13:49.471 } 00:13:49.471 ] 00:13:49.471 }' 00:13:49.471 10:21:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.471 10:21:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:50.409 [2024-07-15 10:21:27.479778] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:13:50.409 [2024-07-15 10:21:27.479820] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:50.409 [2024-07-15 10:21:27.483063] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.409 [2024-07-15 10:21:27.483097] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:50.409 [2024-07-15 10:21:27.483177] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:50.409 [2024-07-15 10:21:27.483188] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc7e320 name raid_bdev1, state offline 00:13:50.409 0 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 492999 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 492999 ']' 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 492999 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 492999 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 492999' 00:13:50.409 killing process with pid 492999 00:13:50.409 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 492999 00:13:50.409 [2024-07-15 10:21:27.549264] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:50.409 10:21:27 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 492999 00:13:50.409 [2024-07-15 10:21:27.560072] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.lRihcEPBlQ 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:50.668 00:13:50.668 real 0m6.274s 00:13:50.668 user 0m9.809s 00:13:50.668 sys 0m1.075s 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:50.668 10:21:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.668 ************************************ 00:13:50.668 END TEST raid_read_error_test 00:13:50.668 ************************************ 00:13:50.668 10:21:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:50.668 10:21:27 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:50.668 10:21:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:50.668 10:21:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:50.668 10:21:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:50.927 ************************************ 00:13:50.927 START TEST raid_write_error_test 00:13:50.927 
************************************ 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:50.927 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.E1ZYSgEZQE 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=493974 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 493974 /var/tmp/spdk-raid.sock 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 493974 ']' 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:50.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:50.928 10:21:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.928 [2024-07-15 10:21:27.948271] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:13:50.928 [2024-07-15 10:21:27.948327] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid493974 ] 00:13:50.928 [2024-07-15 10:21:28.062223] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:51.187 [2024-07-15 10:21:28.166692] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:51.187 [2024-07-15 10:21:28.226835] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.187 [2024-07-15 10:21:28.226871] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.755 10:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:51.755 10:21:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:51.755 10:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:51.755 10:21:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:52.013 BaseBdev1_malloc 00:13:52.013 10:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:52.304 true 00:13:52.304 10:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:52.304 [2024-07-15 10:21:29.400221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:52.304 [2024-07-15 10:21:29.400268] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:13:52.304 [2024-07-15 10:21:29.400287] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15c70d0 00:13:52.304 [2024-07-15 10:21:29.400299] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:52.304 [2024-07-15 10:21:29.402004] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:52.304 [2024-07-15 10:21:29.402033] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:52.304 BaseBdev1 00:13:52.304 10:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:52.304 10:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:52.571 BaseBdev2_malloc 00:13:52.571 10:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:52.829 true 00:13:52.829 10:21:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:53.087 [2024-07-15 10:21:30.110714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:53.087 [2024-07-15 10:21:30.110764] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:53.087 [2024-07-15 10:21:30.110784] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15cb910 00:13:53.087 [2024-07-15 10:21:30.110796] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:53.087 [2024-07-15 10:21:30.112354] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:53.087 [2024-07-15 10:21:30.112382] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:53.087 BaseBdev2 00:13:53.087 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:53.345 [2024-07-15 10:21:30.347351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:53.345 [2024-07-15 10:21:30.348534] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:53.345 [2024-07-15 10:21:30.348718] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15cd320 00:13:53.345 [2024-07-15 10:21:30.348731] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:53.345 [2024-07-15 10:21:30.348906] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1434d00 00:13:53.345 [2024-07-15 10:21:30.349059] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15cd320 00:13:53.345 [2024-07-15 10:21:30.349070] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15cd320 00:13:53.345 [2024-07-15 10:21:30.349176] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.345 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:53.604 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.604 "name": "raid_bdev1", 00:13:53.604 "uuid": "89f77c18-e891-4d1f-a435-5fd7987ec87d", 00:13:53.604 "strip_size_kb": 0, 00:13:53.604 "state": "online", 00:13:53.604 "raid_level": "raid1", 00:13:53.604 "superblock": true, 00:13:53.604 "num_base_bdevs": 2, 00:13:53.604 "num_base_bdevs_discovered": 2, 00:13:53.604 "num_base_bdevs_operational": 2, 00:13:53.604 "base_bdevs_list": [ 00:13:53.604 { 00:13:53.604 "name": "BaseBdev1", 00:13:53.604 "uuid": "115f851e-1974-5c8e-b9cd-2295decd4f24", 00:13:53.604 "is_configured": true, 00:13:53.604 "data_offset": 2048, 00:13:53.604 "data_size": 63488 00:13:53.604 }, 00:13:53.604 { 00:13:53.604 "name": "BaseBdev2", 00:13:53.604 "uuid": "86e62f10-36ee-522c-85be-b9a533d8694e", 00:13:53.604 "is_configured": true, 00:13:53.604 "data_offset": 2048, 00:13:53.604 "data_size": 63488 00:13:53.604 } 00:13:53.604 ] 00:13:53.604 }' 00:13:53.604 10:21:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.604 10:21:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.195 
10:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:54.195 10:21:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:54.195 [2024-07-15 10:21:31.310212] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15c8c70 00:13:55.130 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:55.389 [2024-07-15 10:21:32.433636] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:55.389 [2024-07-15 10:21:32.433702] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:55.389 [2024-07-15 10:21:32.433881] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x15c8c70 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:55.389 10:21:32 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.389 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:55.646 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.646 "name": "raid_bdev1", 00:13:55.646 "uuid": "89f77c18-e891-4d1f-a435-5fd7987ec87d", 00:13:55.646 "strip_size_kb": 0, 00:13:55.646 "state": "online", 00:13:55.646 "raid_level": "raid1", 00:13:55.646 "superblock": true, 00:13:55.646 "num_base_bdevs": 2, 00:13:55.646 "num_base_bdevs_discovered": 1, 00:13:55.646 "num_base_bdevs_operational": 1, 00:13:55.646 "base_bdevs_list": [ 00:13:55.646 { 00:13:55.646 "name": null, 00:13:55.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.646 "is_configured": false, 00:13:55.646 "data_offset": 2048, 00:13:55.646 "data_size": 63488 00:13:55.646 }, 00:13:55.646 { 00:13:55.646 "name": "BaseBdev2", 00:13:55.646 "uuid": "86e62f10-36ee-522c-85be-b9a533d8694e", 00:13:55.646 "is_configured": true, 00:13:55.646 "data_offset": 2048, 00:13:55.646 "data_size": 63488 00:13:55.646 } 00:13:55.646 ] 00:13:55.646 }' 00:13:55.646 10:21:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:13:55.646 10:21:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.213 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:56.471 [2024-07-15 10:21:33.481109] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:56.471 [2024-07-15 10:21:33.481148] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:56.471 [2024-07-15 10:21:33.484284] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:56.471 [2024-07-15 10:21:33.484312] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:56.471 [2024-07-15 10:21:33.484365] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:56.471 [2024-07-15 10:21:33.484377] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15cd320 name raid_bdev1, state offline 00:13:56.471 0 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 493974 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 493974 ']' 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 493974 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 493974 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 493974' 00:13:56.471 killing process with pid 493974 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 493974 00:13:56.471 [2024-07-15 10:21:33.553005] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:56.471 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 493974 00:13:56.471 [2024-07-15 10:21:33.565146] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.E1ZYSgEZQE 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:56.730 00:13:56.730 real 0m5.929s 00:13:56.730 user 0m9.184s 00:13:56.730 sys 0m1.042s 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:56.730 10:21:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.730 ************************************ 00:13:56.730 END TEST raid_write_error_test 00:13:56.730 ************************************ 00:13:56.730 10:21:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:56.730 10:21:33 bdev_raid -- bdev/bdev_raid.sh@865 -- # 
for n in {2..4} 00:13:56.730 10:21:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:56.730 10:21:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:56.730 10:21:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:56.730 10:21:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:56.730 10:21:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:56.730 ************************************ 00:13:56.730 START TEST raid_state_function_test 00:13:56.730 ************************************ 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=494791 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 494791' 00:13:56.730 Process raid pid: 494791 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 494791 /var/tmp/spdk-raid.sock 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 494791 ']' 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:56.730 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:56.730 10:21:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.988 [2024-07-15 10:21:33.973276] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:13:56.988 [2024-07-15 10:21:33.973344] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:56.988 [2024-07-15 10:21:34.096713] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:57.247 [2024-07-15 10:21:34.203445] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:57.247 [2024-07-15 10:21:34.269436] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:57.247 [2024-07-15 10:21:34.269465] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:57.811 10:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:57.811 10:21:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:57.811 10:21:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:58.069 [2024-07-15 10:21:35.128919] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:58.069 [2024-07-15 10:21:35.128968] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:58.069 [2024-07-15 10:21:35.128978] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:58.069 [2024-07-15 10:21:35.128990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:58.069 [2024-07-15 10:21:35.128999] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:58.069 [2024-07-15 10:21:35.129010] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:58.069 10:21:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.069 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.327 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:58.327 "name": "Existed_Raid", 00:13:58.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.327 "strip_size_kb": 64, 00:13:58.327 "state": "configuring", 00:13:58.327 "raid_level": "raid0", 00:13:58.327 "superblock": false, 00:13:58.327 "num_base_bdevs": 3, 00:13:58.327 "num_base_bdevs_discovered": 0, 00:13:58.327 "num_base_bdevs_operational": 3, 00:13:58.327 "base_bdevs_list": [ 00:13:58.327 { 
00:13:58.327 "name": "BaseBdev1", 00:13:58.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.327 "is_configured": false, 00:13:58.327 "data_offset": 0, 00:13:58.327 "data_size": 0 00:13:58.327 }, 00:13:58.327 { 00:13:58.327 "name": "BaseBdev2", 00:13:58.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.327 "is_configured": false, 00:13:58.327 "data_offset": 0, 00:13:58.327 "data_size": 0 00:13:58.327 }, 00:13:58.327 { 00:13:58.327 "name": "BaseBdev3", 00:13:58.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.327 "is_configured": false, 00:13:58.327 "data_offset": 0, 00:13:58.327 "data_size": 0 00:13:58.327 } 00:13:58.327 ] 00:13:58.327 }' 00:13:58.327 10:21:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:58.327 10:21:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.891 10:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:59.149 [2024-07-15 10:21:36.263784] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:59.149 [2024-07-15 10:21:36.263815] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e4a80 name Existed_Raid, state configuring 00:13:59.149 10:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:59.407 [2024-07-15 10:21:36.508452] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:59.407 [2024-07-15 10:21:36.508484] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:59.407 [2024-07-15 10:21:36.508494] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:13:59.407 [2024-07-15 10:21:36.508505] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:59.407 [2024-07-15 10:21:36.508514] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:59.407 [2024-07-15 10:21:36.508526] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:59.407 10:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:59.666 [2024-07-15 10:21:36.764189] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:59.666 BaseBdev1 00:13:59.666 10:21:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:59.667 10:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:59.667 10:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:59.667 10:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:59.667 10:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:59.667 10:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:59.667 10:21:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:59.925 10:21:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:00.183 [ 00:14:00.183 { 00:14:00.183 "name": "BaseBdev1", 00:14:00.183 "aliases": [ 00:14:00.183 
"1088f99e-ec71-4f61-b901-ce764e407568" 00:14:00.183 ], 00:14:00.183 "product_name": "Malloc disk", 00:14:00.183 "block_size": 512, 00:14:00.183 "num_blocks": 65536, 00:14:00.183 "uuid": "1088f99e-ec71-4f61-b901-ce764e407568", 00:14:00.183 "assigned_rate_limits": { 00:14:00.183 "rw_ios_per_sec": 0, 00:14:00.183 "rw_mbytes_per_sec": 0, 00:14:00.183 "r_mbytes_per_sec": 0, 00:14:00.183 "w_mbytes_per_sec": 0 00:14:00.183 }, 00:14:00.183 "claimed": true, 00:14:00.183 "claim_type": "exclusive_write", 00:14:00.183 "zoned": false, 00:14:00.183 "supported_io_types": { 00:14:00.183 "read": true, 00:14:00.183 "write": true, 00:14:00.183 "unmap": true, 00:14:00.183 "flush": true, 00:14:00.183 "reset": true, 00:14:00.183 "nvme_admin": false, 00:14:00.183 "nvme_io": false, 00:14:00.183 "nvme_io_md": false, 00:14:00.183 "write_zeroes": true, 00:14:00.183 "zcopy": true, 00:14:00.183 "get_zone_info": false, 00:14:00.183 "zone_management": false, 00:14:00.183 "zone_append": false, 00:14:00.183 "compare": false, 00:14:00.183 "compare_and_write": false, 00:14:00.183 "abort": true, 00:14:00.183 "seek_hole": false, 00:14:00.183 "seek_data": false, 00:14:00.183 "copy": true, 00:14:00.183 "nvme_iov_md": false 00:14:00.183 }, 00:14:00.183 "memory_domains": [ 00:14:00.183 { 00:14:00.183 "dma_device_id": "system", 00:14:00.183 "dma_device_type": 1 00:14:00.183 }, 00:14:00.183 { 00:14:00.183 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.183 "dma_device_type": 2 00:14:00.183 } 00:14:00.183 ], 00:14:00.183 "driver_specific": {} 00:14:00.183 } 00:14:00.183 ] 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.183 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.441 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.441 "name": "Existed_Raid", 00:14:00.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.441 "strip_size_kb": 64, 00:14:00.441 "state": "configuring", 00:14:00.441 "raid_level": "raid0", 00:14:00.441 "superblock": false, 00:14:00.441 "num_base_bdevs": 3, 00:14:00.441 "num_base_bdevs_discovered": 1, 00:14:00.441 "num_base_bdevs_operational": 3, 00:14:00.441 "base_bdevs_list": [ 00:14:00.441 { 00:14:00.441 "name": "BaseBdev1", 00:14:00.441 "uuid": "1088f99e-ec71-4f61-b901-ce764e407568", 00:14:00.441 "is_configured": true, 00:14:00.441 "data_offset": 0, 00:14:00.441 "data_size": 65536 00:14:00.441 }, 00:14:00.441 { 00:14:00.441 "name": "BaseBdev2", 00:14:00.441 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:00.441 "is_configured": false, 00:14:00.441 "data_offset": 0, 00:14:00.441 "data_size": 0 00:14:00.441 }, 00:14:00.441 { 00:14:00.441 "name": "BaseBdev3", 00:14:00.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.441 "is_configured": false, 00:14:00.441 "data_offset": 0, 00:14:00.441 "data_size": 0 00:14:00.441 } 00:14:00.441 ] 00:14:00.441 }' 00:14:00.441 10:21:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.442 10:21:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.007 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:01.265 [2024-07-15 10:21:38.284226] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:01.265 [2024-07-15 10:21:38.284263] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e4310 name Existed_Raid, state configuring 00:14:01.265 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:01.523 [2024-07-15 10:21:38.528902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:01.523 [2024-07-15 10:21:38.530388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:01.523 [2024-07-15 10:21:38.530421] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:01.523 [2024-07-15 10:21:38.530431] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:01.523 [2024-07-15 10:21:38.530443] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.523 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.781 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.781 "name": "Existed_Raid", 00:14:01.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.781 "strip_size_kb": 64, 00:14:01.781 "state": "configuring", 00:14:01.781 
"raid_level": "raid0", 00:14:01.781 "superblock": false, 00:14:01.781 "num_base_bdevs": 3, 00:14:01.781 "num_base_bdevs_discovered": 1, 00:14:01.781 "num_base_bdevs_operational": 3, 00:14:01.781 "base_bdevs_list": [ 00:14:01.782 { 00:14:01.782 "name": "BaseBdev1", 00:14:01.782 "uuid": "1088f99e-ec71-4f61-b901-ce764e407568", 00:14:01.782 "is_configured": true, 00:14:01.782 "data_offset": 0, 00:14:01.782 "data_size": 65536 00:14:01.782 }, 00:14:01.782 { 00:14:01.782 "name": "BaseBdev2", 00:14:01.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.782 "is_configured": false, 00:14:01.782 "data_offset": 0, 00:14:01.782 "data_size": 0 00:14:01.782 }, 00:14:01.782 { 00:14:01.782 "name": "BaseBdev3", 00:14:01.782 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.782 "is_configured": false, 00:14:01.782 "data_offset": 0, 00:14:01.782 "data_size": 0 00:14:01.782 } 00:14:01.782 ] 00:14:01.782 }' 00:14:01.782 10:21:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.782 10:21:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.347 10:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:02.605 [2024-07-15 10:21:39.615160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:02.605 BaseBdev2 00:14:02.605 10:21:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:02.605 10:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:02.605 10:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:02.605 10:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:02.605 10:21:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:02.605 10:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:02.605 10:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:02.863 10:21:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:03.130 [ 00:14:03.130 { 00:14:03.130 "name": "BaseBdev2", 00:14:03.130 "aliases": [ 00:14:03.130 "c93a4a12-1c04-41ea-b992-bb872e544a23" 00:14:03.130 ], 00:14:03.130 "product_name": "Malloc disk", 00:14:03.130 "block_size": 512, 00:14:03.130 "num_blocks": 65536, 00:14:03.130 "uuid": "c93a4a12-1c04-41ea-b992-bb872e544a23", 00:14:03.130 "assigned_rate_limits": { 00:14:03.130 "rw_ios_per_sec": 0, 00:14:03.130 "rw_mbytes_per_sec": 0, 00:14:03.130 "r_mbytes_per_sec": 0, 00:14:03.130 "w_mbytes_per_sec": 0 00:14:03.130 }, 00:14:03.130 "claimed": true, 00:14:03.130 "claim_type": "exclusive_write", 00:14:03.130 "zoned": false, 00:14:03.130 "supported_io_types": { 00:14:03.130 "read": true, 00:14:03.130 "write": true, 00:14:03.130 "unmap": true, 00:14:03.130 "flush": true, 00:14:03.130 "reset": true, 00:14:03.130 "nvme_admin": false, 00:14:03.130 "nvme_io": false, 00:14:03.130 "nvme_io_md": false, 00:14:03.130 "write_zeroes": true, 00:14:03.130 "zcopy": true, 00:14:03.130 "get_zone_info": false, 00:14:03.130 "zone_management": false, 00:14:03.130 "zone_append": false, 00:14:03.130 "compare": false, 00:14:03.130 "compare_and_write": false, 00:14:03.130 "abort": true, 00:14:03.130 "seek_hole": false, 00:14:03.130 "seek_data": false, 00:14:03.130 "copy": true, 00:14:03.130 "nvme_iov_md": false 00:14:03.130 }, 00:14:03.130 "memory_domains": [ 00:14:03.130 { 00:14:03.130 "dma_device_id": "system", 
00:14:03.130 "dma_device_type": 1 00:14:03.130 }, 00:14:03.130 { 00:14:03.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.130 "dma_device_type": 2 00:14:03.130 } 00:14:03.130 ], 00:14:03.130 "driver_specific": {} 00:14:03.130 } 00:14:03.130 ] 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.130 10:21:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.389 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.389 "name": "Existed_Raid", 00:14:03.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.389 "strip_size_kb": 64, 00:14:03.389 "state": "configuring", 00:14:03.389 "raid_level": "raid0", 00:14:03.389 "superblock": false, 00:14:03.389 "num_base_bdevs": 3, 00:14:03.389 "num_base_bdevs_discovered": 2, 00:14:03.389 "num_base_bdevs_operational": 3, 00:14:03.389 "base_bdevs_list": [ 00:14:03.389 { 00:14:03.389 "name": "BaseBdev1", 00:14:03.389 "uuid": "1088f99e-ec71-4f61-b901-ce764e407568", 00:14:03.389 "is_configured": true, 00:14:03.389 "data_offset": 0, 00:14:03.389 "data_size": 65536 00:14:03.389 }, 00:14:03.389 { 00:14:03.389 "name": "BaseBdev2", 00:14:03.389 "uuid": "c93a4a12-1c04-41ea-b992-bb872e544a23", 00:14:03.389 "is_configured": true, 00:14:03.389 "data_offset": 0, 00:14:03.389 "data_size": 65536 00:14:03.389 }, 00:14:03.389 { 00:14:03.389 "name": "BaseBdev3", 00:14:03.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.389 "is_configured": false, 00:14:03.389 "data_offset": 0, 00:14:03.389 "data_size": 0 00:14:03.389 } 00:14:03.389 ] 00:14:03.389 }' 00:14:03.389 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.389 10:21:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.955 10:21:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:04.215 [2024-07-15 10:21:41.182697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:04.215 [2024-07-15 10:21:41.182743] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11e5400 00:14:04.215 [2024-07-15 10:21:41.182753] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:04.215 [2024-07-15 10:21:41.183007] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e4ef0 00:14:04.215 [2024-07-15 10:21:41.183127] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11e5400 00:14:04.215 [2024-07-15 10:21:41.183137] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11e5400 00:14:04.215 [2024-07-15 10:21:41.183291] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:04.215 BaseBdev3 00:14:04.215 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:04.215 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:04.215 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:04.215 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:04.215 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:04.215 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:04.215 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.474 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:04.733 [ 00:14:04.733 { 00:14:04.733 "name": "BaseBdev3", 00:14:04.733 "aliases": [ 00:14:04.733 "925d1f1d-2daf-41b0-b2c7-90d1eb73bfae" 00:14:04.733 ], 00:14:04.733 "product_name": "Malloc disk", 00:14:04.733 "block_size": 512, 00:14:04.733 "num_blocks": 65536, 00:14:04.733 
"uuid": "925d1f1d-2daf-41b0-b2c7-90d1eb73bfae", 00:14:04.733 "assigned_rate_limits": { 00:14:04.733 "rw_ios_per_sec": 0, 00:14:04.733 "rw_mbytes_per_sec": 0, 00:14:04.733 "r_mbytes_per_sec": 0, 00:14:04.733 "w_mbytes_per_sec": 0 00:14:04.733 }, 00:14:04.733 "claimed": true, 00:14:04.733 "claim_type": "exclusive_write", 00:14:04.733 "zoned": false, 00:14:04.733 "supported_io_types": { 00:14:04.733 "read": true, 00:14:04.733 "write": true, 00:14:04.733 "unmap": true, 00:14:04.733 "flush": true, 00:14:04.733 "reset": true, 00:14:04.733 "nvme_admin": false, 00:14:04.733 "nvme_io": false, 00:14:04.733 "nvme_io_md": false, 00:14:04.733 "write_zeroes": true, 00:14:04.733 "zcopy": true, 00:14:04.733 "get_zone_info": false, 00:14:04.733 "zone_management": false, 00:14:04.733 "zone_append": false, 00:14:04.733 "compare": false, 00:14:04.733 "compare_and_write": false, 00:14:04.733 "abort": true, 00:14:04.733 "seek_hole": false, 00:14:04.733 "seek_data": false, 00:14:04.733 "copy": true, 00:14:04.733 "nvme_iov_md": false 00:14:04.733 }, 00:14:04.733 "memory_domains": [ 00:14:04.733 { 00:14:04.733 "dma_device_id": "system", 00:14:04.733 "dma_device_type": 1 00:14:04.733 }, 00:14:04.733 { 00:14:04.733 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.734 "dma_device_type": 2 00:14:04.734 } 00:14:04.734 ], 00:14:04.734 "driver_specific": {} 00:14:04.734 } 00:14:04.734 ] 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.734 10:21:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.734 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.993 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.993 "name": "Existed_Raid", 00:14:04.993 "uuid": "1d5dbcc6-5c41-457e-b00b-b538d4e1fc84", 00:14:04.993 "strip_size_kb": 64, 00:14:04.993 "state": "online", 00:14:04.993 "raid_level": "raid0", 00:14:04.993 "superblock": false, 00:14:04.993 "num_base_bdevs": 3, 00:14:04.993 "num_base_bdevs_discovered": 3, 00:14:04.993 "num_base_bdevs_operational": 3, 00:14:04.993 "base_bdevs_list": [ 00:14:04.993 { 00:14:04.993 "name": "BaseBdev1", 00:14:04.993 "uuid": "1088f99e-ec71-4f61-b901-ce764e407568", 00:14:04.993 "is_configured": true, 00:14:04.993 "data_offset": 0, 00:14:04.993 "data_size": 65536 00:14:04.993 }, 00:14:04.993 { 00:14:04.993 "name": "BaseBdev2", 00:14:04.993 "uuid": 
"c93a4a12-1c04-41ea-b992-bb872e544a23", 00:14:04.993 "is_configured": true, 00:14:04.993 "data_offset": 0, 00:14:04.993 "data_size": 65536 00:14:04.993 }, 00:14:04.993 { 00:14:04.993 "name": "BaseBdev3", 00:14:04.993 "uuid": "925d1f1d-2daf-41b0-b2c7-90d1eb73bfae", 00:14:04.993 "is_configured": true, 00:14:04.993 "data_offset": 0, 00:14:04.993 "data_size": 65536 00:14:04.993 } 00:14:04.993 ] 00:14:04.993 }' 00:14:04.993 10:21:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.993 10:21:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:05.929 10:21:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:05.929 [2024-07-15 10:21:43.007843] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:05.929 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:05.929 "name": "Existed_Raid", 00:14:05.929 "aliases": [ 00:14:05.929 "1d5dbcc6-5c41-457e-b00b-b538d4e1fc84" 00:14:05.929 ], 00:14:05.929 "product_name": "Raid Volume", 
00:14:05.929 "block_size": 512, 00:14:05.929 "num_blocks": 196608, 00:14:05.929 "uuid": "1d5dbcc6-5c41-457e-b00b-b538d4e1fc84", 00:14:05.929 "assigned_rate_limits": { 00:14:05.929 "rw_ios_per_sec": 0, 00:14:05.929 "rw_mbytes_per_sec": 0, 00:14:05.929 "r_mbytes_per_sec": 0, 00:14:05.929 "w_mbytes_per_sec": 0 00:14:05.929 }, 00:14:05.929 "claimed": false, 00:14:05.929 "zoned": false, 00:14:05.929 "supported_io_types": { 00:14:05.929 "read": true, 00:14:05.929 "write": true, 00:14:05.929 "unmap": true, 00:14:05.929 "flush": true, 00:14:05.929 "reset": true, 00:14:05.929 "nvme_admin": false, 00:14:05.929 "nvme_io": false, 00:14:05.929 "nvme_io_md": false, 00:14:05.929 "write_zeroes": true, 00:14:05.929 "zcopy": false, 00:14:05.929 "get_zone_info": false, 00:14:05.929 "zone_management": false, 00:14:05.929 "zone_append": false, 00:14:05.929 "compare": false, 00:14:05.929 "compare_and_write": false, 00:14:05.929 "abort": false, 00:14:05.929 "seek_hole": false, 00:14:05.929 "seek_data": false, 00:14:05.929 "copy": false, 00:14:05.929 "nvme_iov_md": false 00:14:05.929 }, 00:14:05.929 "memory_domains": [ 00:14:05.929 { 00:14:05.929 "dma_device_id": "system", 00:14:05.929 "dma_device_type": 1 00:14:05.929 }, 00:14:05.929 { 00:14:05.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.929 "dma_device_type": 2 00:14:05.929 }, 00:14:05.929 { 00:14:05.929 "dma_device_id": "system", 00:14:05.929 "dma_device_type": 1 00:14:05.929 }, 00:14:05.929 { 00:14:05.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.929 "dma_device_type": 2 00:14:05.929 }, 00:14:05.929 { 00:14:05.929 "dma_device_id": "system", 00:14:05.929 "dma_device_type": 1 00:14:05.929 }, 00:14:05.929 { 00:14:05.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.929 "dma_device_type": 2 00:14:05.929 } 00:14:05.929 ], 00:14:05.929 "driver_specific": { 00:14:05.929 "raid": { 00:14:05.929 "uuid": "1d5dbcc6-5c41-457e-b00b-b538d4e1fc84", 00:14:05.929 "strip_size_kb": 64, 00:14:05.929 "state": "online", 00:14:05.929 
"raid_level": "raid0", 00:14:05.929 "superblock": false, 00:14:05.929 "num_base_bdevs": 3, 00:14:05.929 "num_base_bdevs_discovered": 3, 00:14:05.929 "num_base_bdevs_operational": 3, 00:14:05.929 "base_bdevs_list": [ 00:14:05.929 { 00:14:05.929 "name": "BaseBdev1", 00:14:05.929 "uuid": "1088f99e-ec71-4f61-b901-ce764e407568", 00:14:05.929 "is_configured": true, 00:14:05.929 "data_offset": 0, 00:14:05.929 "data_size": 65536 00:14:05.929 }, 00:14:05.929 { 00:14:05.929 "name": "BaseBdev2", 00:14:05.929 "uuid": "c93a4a12-1c04-41ea-b992-bb872e544a23", 00:14:05.929 "is_configured": true, 00:14:05.929 "data_offset": 0, 00:14:05.929 "data_size": 65536 00:14:05.929 }, 00:14:05.929 { 00:14:05.929 "name": "BaseBdev3", 00:14:05.929 "uuid": "925d1f1d-2daf-41b0-b2c7-90d1eb73bfae", 00:14:05.929 "is_configured": true, 00:14:05.929 "data_offset": 0, 00:14:05.929 "data_size": 65536 00:14:05.929 } 00:14:05.929 ] 00:14:05.929 } 00:14:05.929 } 00:14:05.929 }' 00:14:05.929 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:05.929 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:05.929 BaseBdev2 00:14:05.929 BaseBdev3' 00:14:05.929 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:05.929 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:05.929 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.187 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.187 "name": "BaseBdev1", 00:14:06.187 "aliases": [ 00:14:06.187 "1088f99e-ec71-4f61-b901-ce764e407568" 00:14:06.187 ], 00:14:06.187 "product_name": "Malloc disk", 00:14:06.187 
"block_size": 512, 00:14:06.187 "num_blocks": 65536, 00:14:06.187 "uuid": "1088f99e-ec71-4f61-b901-ce764e407568", 00:14:06.187 "assigned_rate_limits": { 00:14:06.187 "rw_ios_per_sec": 0, 00:14:06.187 "rw_mbytes_per_sec": 0, 00:14:06.187 "r_mbytes_per_sec": 0, 00:14:06.187 "w_mbytes_per_sec": 0 00:14:06.187 }, 00:14:06.187 "claimed": true, 00:14:06.187 "claim_type": "exclusive_write", 00:14:06.187 "zoned": false, 00:14:06.187 "supported_io_types": { 00:14:06.187 "read": true, 00:14:06.187 "write": true, 00:14:06.187 "unmap": true, 00:14:06.187 "flush": true, 00:14:06.187 "reset": true, 00:14:06.187 "nvme_admin": false, 00:14:06.187 "nvme_io": false, 00:14:06.187 "nvme_io_md": false, 00:14:06.187 "write_zeroes": true, 00:14:06.187 "zcopy": true, 00:14:06.187 "get_zone_info": false, 00:14:06.187 "zone_management": false, 00:14:06.187 "zone_append": false, 00:14:06.187 "compare": false, 00:14:06.187 "compare_and_write": false, 00:14:06.187 "abort": true, 00:14:06.187 "seek_hole": false, 00:14:06.187 "seek_data": false, 00:14:06.187 "copy": true, 00:14:06.187 "nvme_iov_md": false 00:14:06.187 }, 00:14:06.187 "memory_domains": [ 00:14:06.187 { 00:14:06.187 "dma_device_id": "system", 00:14:06.187 "dma_device_type": 1 00:14:06.187 }, 00:14:06.187 { 00:14:06.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.187 "dma_device_type": 2 00:14:06.187 } 00:14:06.187 ], 00:14:06.187 "driver_specific": {} 00:14:06.187 }' 00:14:06.187 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.187 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.444 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.762 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.762 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.762 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:06.762 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.046 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.046 "name": "BaseBdev2", 00:14:07.046 "aliases": [ 00:14:07.046 "c93a4a12-1c04-41ea-b992-bb872e544a23" 00:14:07.046 ], 00:14:07.046 "product_name": "Malloc disk", 00:14:07.046 "block_size": 512, 00:14:07.046 "num_blocks": 65536, 00:14:07.046 "uuid": "c93a4a12-1c04-41ea-b992-bb872e544a23", 00:14:07.046 "assigned_rate_limits": { 00:14:07.046 "rw_ios_per_sec": 0, 00:14:07.046 "rw_mbytes_per_sec": 0, 00:14:07.046 "r_mbytes_per_sec": 0, 00:14:07.046 "w_mbytes_per_sec": 0 00:14:07.046 }, 00:14:07.046 "claimed": true, 00:14:07.046 "claim_type": "exclusive_write", 00:14:07.046 "zoned": false, 00:14:07.046 "supported_io_types": { 00:14:07.046 "read": true, 00:14:07.046 "write": true, 00:14:07.046 "unmap": true, 00:14:07.046 "flush": true, 00:14:07.046 "reset": true, 00:14:07.046 "nvme_admin": 
false, 00:14:07.046 "nvme_io": false, 00:14:07.046 "nvme_io_md": false, 00:14:07.046 "write_zeroes": true, 00:14:07.046 "zcopy": true, 00:14:07.046 "get_zone_info": false, 00:14:07.046 "zone_management": false, 00:14:07.047 "zone_append": false, 00:14:07.047 "compare": false, 00:14:07.047 "compare_and_write": false, 00:14:07.047 "abort": true, 00:14:07.047 "seek_hole": false, 00:14:07.047 "seek_data": false, 00:14:07.047 "copy": true, 00:14:07.047 "nvme_iov_md": false 00:14:07.047 }, 00:14:07.047 "memory_domains": [ 00:14:07.047 { 00:14:07.047 "dma_device_id": "system", 00:14:07.047 "dma_device_type": 1 00:14:07.047 }, 00:14:07.047 { 00:14:07.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.047 "dma_device_type": 2 00:14:07.047 } 00:14:07.047 ], 00:14:07.047 "driver_specific": {} 00:14:07.047 }' 00:14:07.047 10:21:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.047 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.047 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.047 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.047 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.047 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.047 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.047 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.305 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.305 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.305 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.305 10:21:44 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.305 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:07.305 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:07.305 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.563 "name": "BaseBdev3", 00:14:07.563 "aliases": [ 00:14:07.563 "925d1f1d-2daf-41b0-b2c7-90d1eb73bfae" 00:14:07.563 ], 00:14:07.563 "product_name": "Malloc disk", 00:14:07.563 "block_size": 512, 00:14:07.563 "num_blocks": 65536, 00:14:07.563 "uuid": "925d1f1d-2daf-41b0-b2c7-90d1eb73bfae", 00:14:07.563 "assigned_rate_limits": { 00:14:07.563 "rw_ios_per_sec": 0, 00:14:07.563 "rw_mbytes_per_sec": 0, 00:14:07.563 "r_mbytes_per_sec": 0, 00:14:07.563 "w_mbytes_per_sec": 0 00:14:07.563 }, 00:14:07.563 "claimed": true, 00:14:07.563 "claim_type": "exclusive_write", 00:14:07.563 "zoned": false, 00:14:07.563 "supported_io_types": { 00:14:07.563 "read": true, 00:14:07.563 "write": true, 00:14:07.563 "unmap": true, 00:14:07.563 "flush": true, 00:14:07.563 "reset": true, 00:14:07.563 "nvme_admin": false, 00:14:07.563 "nvme_io": false, 00:14:07.563 "nvme_io_md": false, 00:14:07.563 "write_zeroes": true, 00:14:07.563 "zcopy": true, 00:14:07.563 "get_zone_info": false, 00:14:07.563 "zone_management": false, 00:14:07.563 "zone_append": false, 00:14:07.563 "compare": false, 00:14:07.563 "compare_and_write": false, 00:14:07.563 "abort": true, 00:14:07.563 "seek_hole": false, 00:14:07.563 "seek_data": false, 00:14:07.563 "copy": true, 00:14:07.563 "nvme_iov_md": false 00:14:07.563 }, 00:14:07.563 "memory_domains": [ 00:14:07.563 { 00:14:07.563 "dma_device_id": "system", 00:14:07.563 "dma_device_type": 1 00:14:07.563 
}, 00:14:07.563 { 00:14:07.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.563 "dma_device_type": 2 00:14:07.563 } 00:14:07.563 ], 00:14:07.563 "driver_specific": {} 00:14:07.563 }' 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.563 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.822 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.822 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.822 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.822 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.822 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.822 10:21:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:08.080 [2024-07-15 10:21:45.137265] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:08.080 [2024-07-15 10:21:45.137290] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:08.080 [2024-07-15 10:21:45.137331] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:08.080 
10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.080 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.081 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:14:08.340 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:08.340 "name": "Existed_Raid", 00:14:08.340 "uuid": "1d5dbcc6-5c41-457e-b00b-b538d4e1fc84", 00:14:08.340 "strip_size_kb": 64, 00:14:08.340 "state": "offline", 00:14:08.340 "raid_level": "raid0", 00:14:08.340 "superblock": false, 00:14:08.340 "num_base_bdevs": 3, 00:14:08.340 "num_base_bdevs_discovered": 2, 00:14:08.340 "num_base_bdevs_operational": 2, 00:14:08.340 "base_bdevs_list": [ 00:14:08.340 { 00:14:08.340 "name": null, 00:14:08.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:08.340 "is_configured": false, 00:14:08.340 "data_offset": 0, 00:14:08.340 "data_size": 65536 00:14:08.340 }, 00:14:08.340 { 00:14:08.340 "name": "BaseBdev2", 00:14:08.340 "uuid": "c93a4a12-1c04-41ea-b992-bb872e544a23", 00:14:08.340 "is_configured": true, 00:14:08.340 "data_offset": 0, 00:14:08.340 "data_size": 65536 00:14:08.340 }, 00:14:08.340 { 00:14:08.340 "name": "BaseBdev3", 00:14:08.340 "uuid": "925d1f1d-2daf-41b0-b2c7-90d1eb73bfae", 00:14:08.340 "is_configured": true, 00:14:08.340 "data_offset": 0, 00:14:08.340 "data_size": 65536 00:14:08.340 } 00:14:08.340 ] 00:14:08.340 }' 00:14:08.340 10:21:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:08.340 10:21:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.276 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:09.276 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:09.276 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:09.276 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.535 10:21:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:09.535 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:09.535 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:09.793 [2024-07-15 10:21:46.742605] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:09.793 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:09.793 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:09.793 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.793 10:21:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:10.052 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:10.052 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:10.052 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:10.052 [2024-07-15 10:21:47.228288] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:10.052 [2024-07-15 10:21:47.228330] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e5400 name Existed_Raid, state offline 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:10.310 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:10.568 BaseBdev2 00:14:10.568 10:21:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:10.568 10:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:10.568 10:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:10.568 10:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:10.568 10:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:10.568 10:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:10.568 10:21:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:11.136 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:11.395 [ 00:14:11.395 { 00:14:11.395 "name": "BaseBdev2", 00:14:11.395 "aliases": [ 00:14:11.395 "5e6f4eb8-eda4-44aa-8f66-be104566d3e4" 00:14:11.395 ], 00:14:11.395 "product_name": "Malloc disk", 00:14:11.395 "block_size": 512, 00:14:11.395 "num_blocks": 65536, 00:14:11.395 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:11.395 "assigned_rate_limits": { 00:14:11.395 "rw_ios_per_sec": 0, 00:14:11.395 "rw_mbytes_per_sec": 0, 00:14:11.395 "r_mbytes_per_sec": 0, 00:14:11.395 "w_mbytes_per_sec": 0 00:14:11.395 }, 00:14:11.395 "claimed": false, 00:14:11.395 "zoned": false, 00:14:11.395 "supported_io_types": { 00:14:11.395 "read": true, 00:14:11.395 "write": true, 00:14:11.395 "unmap": true, 00:14:11.395 "flush": true, 00:14:11.395 "reset": true, 00:14:11.395 "nvme_admin": false, 00:14:11.395 "nvme_io": false, 00:14:11.395 "nvme_io_md": false, 00:14:11.395 "write_zeroes": true, 00:14:11.395 "zcopy": true, 00:14:11.395 "get_zone_info": false, 00:14:11.395 "zone_management": false, 00:14:11.395 "zone_append": false, 00:14:11.395 "compare": false, 00:14:11.395 "compare_and_write": false, 00:14:11.395 "abort": true, 00:14:11.395 "seek_hole": false, 00:14:11.395 "seek_data": false, 00:14:11.395 "copy": true, 00:14:11.395 "nvme_iov_md": false 00:14:11.395 }, 00:14:11.395 "memory_domains": [ 00:14:11.395 { 00:14:11.395 "dma_device_id": "system", 00:14:11.395 "dma_device_type": 1 00:14:11.395 }, 00:14:11.395 { 00:14:11.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.395 "dma_device_type": 2 00:14:11.395 } 00:14:11.395 ], 00:14:11.395 "driver_specific": {} 00:14:11.395 } 00:14:11.395 ] 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:11.395 BaseBdev3 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:11.395 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:11.655 10:21:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:11.914 [ 00:14:11.914 { 00:14:11.914 "name": "BaseBdev3", 00:14:11.914 "aliases": [ 00:14:11.914 "0e4ca316-bb52-49da-96fa-2b06130fb224" 00:14:11.914 ], 00:14:11.914 "product_name": "Malloc disk", 00:14:11.914 "block_size": 512, 00:14:11.914 "num_blocks": 65536, 00:14:11.914 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:11.914 "assigned_rate_limits": { 00:14:11.914 "rw_ios_per_sec": 0, 00:14:11.914 "rw_mbytes_per_sec": 0, 00:14:11.914 "r_mbytes_per_sec": 0, 00:14:11.914 "w_mbytes_per_sec": 0 00:14:11.914 }, 00:14:11.914 "claimed": false, 00:14:11.914 "zoned": false, 00:14:11.914 
"supported_io_types": { 00:14:11.914 "read": true, 00:14:11.914 "write": true, 00:14:11.914 "unmap": true, 00:14:11.914 "flush": true, 00:14:11.914 "reset": true, 00:14:11.914 "nvme_admin": false, 00:14:11.914 "nvme_io": false, 00:14:11.914 "nvme_io_md": false, 00:14:11.914 "write_zeroes": true, 00:14:11.914 "zcopy": true, 00:14:11.914 "get_zone_info": false, 00:14:11.914 "zone_management": false, 00:14:11.914 "zone_append": false, 00:14:11.914 "compare": false, 00:14:11.914 "compare_and_write": false, 00:14:11.914 "abort": true, 00:14:11.914 "seek_hole": false, 00:14:11.914 "seek_data": false, 00:14:11.914 "copy": true, 00:14:11.914 "nvme_iov_md": false 00:14:11.914 }, 00:14:11.914 "memory_domains": [ 00:14:11.914 { 00:14:11.914 "dma_device_id": "system", 00:14:11.914 "dma_device_type": 1 00:14:11.914 }, 00:14:11.914 { 00:14:11.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.914 "dma_device_type": 2 00:14:11.914 } 00:14:11.914 ], 00:14:11.914 "driver_specific": {} 00:14:11.914 } 00:14:11.914 ] 00:14:11.914 10:21:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:11.914 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:11.914 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:11.914 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:12.174 [2024-07-15 10:21:49.254170] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:12.174 [2024-07-15 10:21:49.254213] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:12.174 [2024-07-15 10:21:49.254235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:12.174 
[2024-07-15 10:21:49.255655] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.174 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.433 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.433 "name": "Existed_Raid", 00:14:12.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.433 "strip_size_kb": 64, 00:14:12.433 "state": "configuring", 00:14:12.433 "raid_level": "raid0", 00:14:12.433 "superblock": false, 00:14:12.433 "num_base_bdevs": 3, 00:14:12.433 
"num_base_bdevs_discovered": 2, 00:14:12.433 "num_base_bdevs_operational": 3, 00:14:12.433 "base_bdevs_list": [ 00:14:12.433 { 00:14:12.433 "name": "BaseBdev1", 00:14:12.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.433 "is_configured": false, 00:14:12.433 "data_offset": 0, 00:14:12.433 "data_size": 0 00:14:12.433 }, 00:14:12.433 { 00:14:12.433 "name": "BaseBdev2", 00:14:12.433 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:12.433 "is_configured": true, 00:14:12.433 "data_offset": 0, 00:14:12.433 "data_size": 65536 00:14:12.433 }, 00:14:12.433 { 00:14:12.433 "name": "BaseBdev3", 00:14:12.433 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:12.433 "is_configured": true, 00:14:12.433 "data_offset": 0, 00:14:12.433 "data_size": 65536 00:14:12.433 } 00:14:12.433 ] 00:14:12.433 }' 00:14:12.433 10:21:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.433 10:21:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.001 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:13.261 [2024-07-15 10:21:50.397182] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.261 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.520 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.520 "name": "Existed_Raid", 00:14:13.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.520 "strip_size_kb": 64, 00:14:13.520 "state": "configuring", 00:14:13.520 "raid_level": "raid0", 00:14:13.520 "superblock": false, 00:14:13.520 "num_base_bdevs": 3, 00:14:13.520 "num_base_bdevs_discovered": 1, 00:14:13.520 "num_base_bdevs_operational": 3, 00:14:13.520 "base_bdevs_list": [ 00:14:13.520 { 00:14:13.520 "name": "BaseBdev1", 00:14:13.520 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.520 "is_configured": false, 00:14:13.520 "data_offset": 0, 00:14:13.520 "data_size": 0 00:14:13.520 }, 00:14:13.520 { 00:14:13.520 "name": null, 00:14:13.520 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:13.520 "is_configured": false, 00:14:13.520 "data_offset": 0, 00:14:13.520 "data_size": 65536 00:14:13.520 }, 00:14:13.520 { 00:14:13.520 "name": "BaseBdev3", 00:14:13.520 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:13.520 "is_configured": true, 00:14:13.520 "data_offset": 0, 00:14:13.520 "data_size": 65536 00:14:13.520 } 
00:14:13.520 ] 00:14:13.520 }' 00:14:13.520 10:21:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.520 10:21:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.088 10:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:14.088 10:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.347 10:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:14.347 10:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:14.607 [2024-07-15 10:21:51.744165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:14.607 BaseBdev1 00:14:14.607 10:21:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:14.607 10:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:14.607 10:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:14.607 10:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:14.607 10:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:14.607 10:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:14.607 10:21:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:14.866 10:21:52 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:15.125 [ 00:14:15.125 { 00:14:15.125 "name": "BaseBdev1", 00:14:15.125 "aliases": [ 00:14:15.125 "3707cdce-1293-4e61-8f2d-002c1756b0ff" 00:14:15.125 ], 00:14:15.125 "product_name": "Malloc disk", 00:14:15.125 "block_size": 512, 00:14:15.125 "num_blocks": 65536, 00:14:15.125 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:15.125 "assigned_rate_limits": { 00:14:15.125 "rw_ios_per_sec": 0, 00:14:15.125 "rw_mbytes_per_sec": 0, 00:14:15.125 "r_mbytes_per_sec": 0, 00:14:15.125 "w_mbytes_per_sec": 0 00:14:15.125 }, 00:14:15.125 "claimed": true, 00:14:15.125 "claim_type": "exclusive_write", 00:14:15.125 "zoned": false, 00:14:15.125 "supported_io_types": { 00:14:15.125 "read": true, 00:14:15.125 "write": true, 00:14:15.125 "unmap": true, 00:14:15.125 "flush": true, 00:14:15.125 "reset": true, 00:14:15.125 "nvme_admin": false, 00:14:15.125 "nvme_io": false, 00:14:15.125 "nvme_io_md": false, 00:14:15.125 "write_zeroes": true, 00:14:15.125 "zcopy": true, 00:14:15.125 "get_zone_info": false, 00:14:15.125 "zone_management": false, 00:14:15.125 "zone_append": false, 00:14:15.125 "compare": false, 00:14:15.125 "compare_and_write": false, 00:14:15.125 "abort": true, 00:14:15.125 "seek_hole": false, 00:14:15.125 "seek_data": false, 00:14:15.125 "copy": true, 00:14:15.125 "nvme_iov_md": false 00:14:15.125 }, 00:14:15.125 "memory_domains": [ 00:14:15.125 { 00:14:15.125 "dma_device_id": "system", 00:14:15.125 "dma_device_type": 1 00:14:15.125 }, 00:14:15.125 { 00:14:15.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:15.125 "dma_device_type": 2 00:14:15.125 } 00:14:15.125 ], 00:14:15.125 "driver_specific": {} 00:14:15.125 } 00:14:15.125 ] 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.125 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.126 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.126 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.385 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.385 "name": "Existed_Raid", 00:14:15.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.385 "strip_size_kb": 64, 00:14:15.385 "state": "configuring", 00:14:15.385 "raid_level": "raid0", 00:14:15.385 "superblock": false, 00:14:15.385 "num_base_bdevs": 3, 00:14:15.385 "num_base_bdevs_discovered": 2, 00:14:15.385 "num_base_bdevs_operational": 3, 00:14:15.385 "base_bdevs_list": [ 00:14:15.385 { 00:14:15.385 "name": "BaseBdev1", 00:14:15.385 
"uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:15.385 "is_configured": true, 00:14:15.385 "data_offset": 0, 00:14:15.385 "data_size": 65536 00:14:15.385 }, 00:14:15.385 { 00:14:15.385 "name": null, 00:14:15.385 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:15.385 "is_configured": false, 00:14:15.385 "data_offset": 0, 00:14:15.385 "data_size": 65536 00:14:15.385 }, 00:14:15.385 { 00:14:15.385 "name": "BaseBdev3", 00:14:15.385 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:15.385 "is_configured": true, 00:14:15.385 "data_offset": 0, 00:14:15.385 "data_size": 65536 00:14:15.385 } 00:14:15.385 ] 00:14:15.385 }' 00:14:15.385 10:21:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.385 10:21:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.951 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.951 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:16.218 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:16.218 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:16.788 [2024-07-15 10:21:53.741473] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.788 10:21:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.047 10:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.047 "name": "Existed_Raid", 00:14:17.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.047 "strip_size_kb": 64, 00:14:17.047 "state": "configuring", 00:14:17.047 "raid_level": "raid0", 00:14:17.047 "superblock": false, 00:14:17.047 "num_base_bdevs": 3, 00:14:17.047 "num_base_bdevs_discovered": 1, 00:14:17.047 "num_base_bdevs_operational": 3, 00:14:17.047 "base_bdevs_list": [ 00:14:17.047 { 00:14:17.047 "name": "BaseBdev1", 00:14:17.047 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:17.047 "is_configured": true, 00:14:17.047 "data_offset": 0, 00:14:17.047 "data_size": 65536 00:14:17.047 }, 00:14:17.047 { 00:14:17.047 "name": null, 00:14:17.047 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:17.047 "is_configured": false, 00:14:17.047 
"data_offset": 0, 00:14:17.047 "data_size": 65536 00:14:17.047 }, 00:14:17.047 { 00:14:17.047 "name": null, 00:14:17.047 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:17.047 "is_configured": false, 00:14:17.047 "data_offset": 0, 00:14:17.047 "data_size": 65536 00:14:17.047 } 00:14:17.047 ] 00:14:17.047 }' 00:14:17.047 10:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.047 10:21:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.614 10:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.614 10:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:17.874 10:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:17.874 10:21:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:18.133 [2024-07-15 10:21:55.081054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.133 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:18.392 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.392 "name": "Existed_Raid", 00:14:18.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:18.392 "strip_size_kb": 64, 00:14:18.392 "state": "configuring", 00:14:18.392 "raid_level": "raid0", 00:14:18.392 "superblock": false, 00:14:18.392 "num_base_bdevs": 3, 00:14:18.392 "num_base_bdevs_discovered": 2, 00:14:18.392 "num_base_bdevs_operational": 3, 00:14:18.392 "base_bdevs_list": [ 00:14:18.392 { 00:14:18.392 "name": "BaseBdev1", 00:14:18.392 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:18.392 "is_configured": true, 00:14:18.392 "data_offset": 0, 00:14:18.392 "data_size": 65536 00:14:18.392 }, 00:14:18.392 { 00:14:18.392 "name": null, 00:14:18.392 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:18.392 "is_configured": false, 00:14:18.392 "data_offset": 0, 00:14:18.392 "data_size": 65536 00:14:18.392 }, 00:14:18.392 { 00:14:18.392 "name": "BaseBdev3", 00:14:18.392 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:18.392 "is_configured": true, 00:14:18.392 "data_offset": 0, 00:14:18.392 "data_size": 65536 00:14:18.392 } 00:14:18.392 ] 
00:14:18.392 }' 00:14:18.392 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.392 10:21:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.960 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.960 10:21:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:19.220 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:19.220 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:19.220 [2024-07-15 10:21:56.408579] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.480 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:19.739 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:19.739 "name": "Existed_Raid", 00:14:19.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:19.739 "strip_size_kb": 64, 00:14:19.739 "state": "configuring", 00:14:19.739 "raid_level": "raid0", 00:14:19.739 "superblock": false, 00:14:19.739 "num_base_bdevs": 3, 00:14:19.739 "num_base_bdevs_discovered": 1, 00:14:19.739 "num_base_bdevs_operational": 3, 00:14:19.739 "base_bdevs_list": [ 00:14:19.739 { 00:14:19.739 "name": null, 00:14:19.739 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:19.739 "is_configured": false, 00:14:19.739 "data_offset": 0, 00:14:19.739 "data_size": 65536 00:14:19.739 }, 00:14:19.739 { 00:14:19.739 "name": null, 00:14:19.739 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:19.739 "is_configured": false, 00:14:19.739 "data_offset": 0, 00:14:19.739 "data_size": 65536 00:14:19.739 }, 00:14:19.739 { 00:14:19.739 "name": "BaseBdev3", 00:14:19.739 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:19.739 "is_configured": true, 00:14:19.739 "data_offset": 0, 00:14:19.739 "data_size": 65536 00:14:19.739 } 00:14:19.739 ] 00:14:19.739 }' 00:14:19.739 10:21:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:19.739 10:21:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.307 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.307 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:20.565 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:20.565 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:20.565 [2024-07-15 10:21:57.736432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.825 10:21:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:20.825 10:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.825 "name": "Existed_Raid", 00:14:20.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.825 "strip_size_kb": 64, 00:14:20.825 "state": "configuring", 00:14:20.825 "raid_level": "raid0", 00:14:20.825 "superblock": false, 00:14:20.825 "num_base_bdevs": 3, 00:14:20.825 "num_base_bdevs_discovered": 2, 00:14:20.825 "num_base_bdevs_operational": 3, 00:14:20.825 "base_bdevs_list": [ 00:14:20.825 { 00:14:20.825 "name": null, 00:14:20.825 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:20.825 "is_configured": false, 00:14:20.825 "data_offset": 0, 00:14:20.825 "data_size": 65536 00:14:20.825 }, 00:14:20.825 { 00:14:20.825 "name": "BaseBdev2", 00:14:20.825 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:20.825 "is_configured": true, 00:14:20.825 "data_offset": 0, 00:14:20.825 "data_size": 65536 00:14:20.825 }, 00:14:20.825 { 00:14:20.825 "name": "BaseBdev3", 00:14:20.825 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:20.825 "is_configured": true, 00:14:20.825 "data_offset": 0, 00:14:20.825 "data_size": 65536 00:14:20.825 } 00:14:20.825 ] 00:14:20.825 }' 00:14:20.825 10:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.825 10:21:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.460 10:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.460 10:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:21.718 
10:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:21.718 10:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.718 10:21:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:21.975 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3707cdce-1293-4e61-8f2d-002c1756b0ff 00:14:22.234 [2024-07-15 10:21:59.329238] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:22.234 [2024-07-15 10:21:59.329278] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11e3450 00:14:22.234 [2024-07-15 10:21:59.329287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:22.234 [2024-07-15 10:21:59.329477] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e4a50 00:14:22.234 [2024-07-15 10:21:59.329592] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11e3450 00:14:22.234 [2024-07-15 10:21:59.329601] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11e3450 00:14:22.234 [2024-07-15 10:21:59.329766] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.234 NewBaseBdev 00:14:22.234 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:22.234 10:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:22.234 10:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:22.234 10:21:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:14:22.234 10:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:22.234 10:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:22.234 10:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:22.493 10:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:22.752 [ 00:14:22.752 { 00:14:22.752 "name": "NewBaseBdev", 00:14:22.752 "aliases": [ 00:14:22.752 "3707cdce-1293-4e61-8f2d-002c1756b0ff" 00:14:22.752 ], 00:14:22.752 "product_name": "Malloc disk", 00:14:22.752 "block_size": 512, 00:14:22.752 "num_blocks": 65536, 00:14:22.752 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:22.752 "assigned_rate_limits": { 00:14:22.752 "rw_ios_per_sec": 0, 00:14:22.752 "rw_mbytes_per_sec": 0, 00:14:22.752 "r_mbytes_per_sec": 0, 00:14:22.752 "w_mbytes_per_sec": 0 00:14:22.752 }, 00:14:22.752 "claimed": true, 00:14:22.752 "claim_type": "exclusive_write", 00:14:22.752 "zoned": false, 00:14:22.752 "supported_io_types": { 00:14:22.752 "read": true, 00:14:22.752 "write": true, 00:14:22.752 "unmap": true, 00:14:22.752 "flush": true, 00:14:22.752 "reset": true, 00:14:22.752 "nvme_admin": false, 00:14:22.752 "nvme_io": false, 00:14:22.752 "nvme_io_md": false, 00:14:22.752 "write_zeroes": true, 00:14:22.752 "zcopy": true, 00:14:22.752 "get_zone_info": false, 00:14:22.752 "zone_management": false, 00:14:22.752 "zone_append": false, 00:14:22.752 "compare": false, 00:14:22.752 "compare_and_write": false, 00:14:22.752 "abort": true, 00:14:22.752 "seek_hole": false, 00:14:22.752 "seek_data": false, 00:14:22.752 "copy": true, 00:14:22.752 "nvme_iov_md": 
false 00:14:22.752 }, 00:14:22.752 "memory_domains": [ 00:14:22.752 { 00:14:22.752 "dma_device_id": "system", 00:14:22.752 "dma_device_type": 1 00:14:22.752 }, 00:14:22.752 { 00:14:22.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.752 "dma_device_type": 2 00:14:22.752 } 00:14:22.752 ], 00:14:22.752 "driver_specific": {} 00:14:22.752 } 00:14:22.752 ] 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.752 10:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:23.011 10:22:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:23.011 "name": "Existed_Raid", 00:14:23.011 "uuid": "077aa73e-2a88-4064-bff6-70a1f570c86f", 00:14:23.011 "strip_size_kb": 64, 00:14:23.011 "state": "online", 00:14:23.011 "raid_level": "raid0", 00:14:23.011 "superblock": false, 00:14:23.011 "num_base_bdevs": 3, 00:14:23.011 "num_base_bdevs_discovered": 3, 00:14:23.011 "num_base_bdevs_operational": 3, 00:14:23.011 "base_bdevs_list": [ 00:14:23.011 { 00:14:23.011 "name": "NewBaseBdev", 00:14:23.011 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:23.011 "is_configured": true, 00:14:23.011 "data_offset": 0, 00:14:23.011 "data_size": 65536 00:14:23.011 }, 00:14:23.011 { 00:14:23.011 "name": "BaseBdev2", 00:14:23.011 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:23.011 "is_configured": true, 00:14:23.011 "data_offset": 0, 00:14:23.011 "data_size": 65536 00:14:23.011 }, 00:14:23.011 { 00:14:23.011 "name": "BaseBdev3", 00:14:23.011 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:23.011 "is_configured": true, 00:14:23.011 "data_offset": 0, 00:14:23.011 "data_size": 65536 00:14:23.011 } 00:14:23.011 ] 00:14:23.011 }' 00:14:23.011 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:23.011 10:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.629 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:23.629 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:23.629 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:23.629 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:23.629 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:23.629 10:22:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:23.629 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:23.629 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:23.886 [2024-07-15 10:22:00.829550] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:23.886 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:23.886 "name": "Existed_Raid", 00:14:23.886 "aliases": [ 00:14:23.886 "077aa73e-2a88-4064-bff6-70a1f570c86f" 00:14:23.886 ], 00:14:23.886 "product_name": "Raid Volume", 00:14:23.886 "block_size": 512, 00:14:23.886 "num_blocks": 196608, 00:14:23.886 "uuid": "077aa73e-2a88-4064-bff6-70a1f570c86f", 00:14:23.886 "assigned_rate_limits": { 00:14:23.886 "rw_ios_per_sec": 0, 00:14:23.886 "rw_mbytes_per_sec": 0, 00:14:23.886 "r_mbytes_per_sec": 0, 00:14:23.886 "w_mbytes_per_sec": 0 00:14:23.886 }, 00:14:23.886 "claimed": false, 00:14:23.886 "zoned": false, 00:14:23.886 "supported_io_types": { 00:14:23.886 "read": true, 00:14:23.886 "write": true, 00:14:23.886 "unmap": true, 00:14:23.886 "flush": true, 00:14:23.886 "reset": true, 00:14:23.886 "nvme_admin": false, 00:14:23.886 "nvme_io": false, 00:14:23.886 "nvme_io_md": false, 00:14:23.886 "write_zeroes": true, 00:14:23.886 "zcopy": false, 00:14:23.886 "get_zone_info": false, 00:14:23.886 "zone_management": false, 00:14:23.886 "zone_append": false, 00:14:23.886 "compare": false, 00:14:23.886 "compare_and_write": false, 00:14:23.886 "abort": false, 00:14:23.886 "seek_hole": false, 00:14:23.886 "seek_data": false, 00:14:23.886 "copy": false, 00:14:23.886 "nvme_iov_md": false 00:14:23.886 }, 00:14:23.886 "memory_domains": [ 00:14:23.886 { 00:14:23.886 "dma_device_id": "system", 00:14:23.886 "dma_device_type": 1 00:14:23.886 }, 
00:14:23.886 { 00:14:23.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.886 "dma_device_type": 2 00:14:23.886 }, 00:14:23.886 { 00:14:23.886 "dma_device_id": "system", 00:14:23.886 "dma_device_type": 1 00:14:23.886 }, 00:14:23.886 { 00:14:23.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.886 "dma_device_type": 2 00:14:23.886 }, 00:14:23.886 { 00:14:23.886 "dma_device_id": "system", 00:14:23.886 "dma_device_type": 1 00:14:23.886 }, 00:14:23.886 { 00:14:23.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.886 "dma_device_type": 2 00:14:23.886 } 00:14:23.886 ], 00:14:23.886 "driver_specific": { 00:14:23.886 "raid": { 00:14:23.886 "uuid": "077aa73e-2a88-4064-bff6-70a1f570c86f", 00:14:23.886 "strip_size_kb": 64, 00:14:23.886 "state": "online", 00:14:23.886 "raid_level": "raid0", 00:14:23.886 "superblock": false, 00:14:23.886 "num_base_bdevs": 3, 00:14:23.886 "num_base_bdevs_discovered": 3, 00:14:23.886 "num_base_bdevs_operational": 3, 00:14:23.886 "base_bdevs_list": [ 00:14:23.886 { 00:14:23.886 "name": "NewBaseBdev", 00:14:23.886 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:23.886 "is_configured": true, 00:14:23.886 "data_offset": 0, 00:14:23.886 "data_size": 65536 00:14:23.886 }, 00:14:23.886 { 00:14:23.886 "name": "BaseBdev2", 00:14:23.886 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:23.886 "is_configured": true, 00:14:23.886 "data_offset": 0, 00:14:23.886 "data_size": 65536 00:14:23.886 }, 00:14:23.886 { 00:14:23.886 "name": "BaseBdev3", 00:14:23.886 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:23.886 "is_configured": true, 00:14:23.886 "data_offset": 0, 00:14:23.886 "data_size": 65536 00:14:23.886 } 00:14:23.886 ] 00:14:23.886 } 00:14:23.886 } 00:14:23.886 }' 00:14:23.886 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:23.886 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:14:23.886 BaseBdev2 00:14:23.886 BaseBdev3' 00:14:23.886 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:23.886 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:23.886 10:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.143 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.143 "name": "NewBaseBdev", 00:14:24.143 "aliases": [ 00:14:24.143 "3707cdce-1293-4e61-8f2d-002c1756b0ff" 00:14:24.143 ], 00:14:24.143 "product_name": "Malloc disk", 00:14:24.143 "block_size": 512, 00:14:24.143 "num_blocks": 65536, 00:14:24.143 "uuid": "3707cdce-1293-4e61-8f2d-002c1756b0ff", 00:14:24.143 "assigned_rate_limits": { 00:14:24.143 "rw_ios_per_sec": 0, 00:14:24.143 "rw_mbytes_per_sec": 0, 00:14:24.143 "r_mbytes_per_sec": 0, 00:14:24.143 "w_mbytes_per_sec": 0 00:14:24.143 }, 00:14:24.143 "claimed": true, 00:14:24.143 "claim_type": "exclusive_write", 00:14:24.143 "zoned": false, 00:14:24.143 "supported_io_types": { 00:14:24.143 "read": true, 00:14:24.143 "write": true, 00:14:24.143 "unmap": true, 00:14:24.143 "flush": true, 00:14:24.143 "reset": true, 00:14:24.144 "nvme_admin": false, 00:14:24.144 "nvme_io": false, 00:14:24.144 "nvme_io_md": false, 00:14:24.144 "write_zeroes": true, 00:14:24.144 "zcopy": true, 00:14:24.144 "get_zone_info": false, 00:14:24.144 "zone_management": false, 00:14:24.144 "zone_append": false, 00:14:24.144 "compare": false, 00:14:24.144 "compare_and_write": false, 00:14:24.144 "abort": true, 00:14:24.144 "seek_hole": false, 00:14:24.144 "seek_data": false, 00:14:24.144 "copy": true, 00:14:24.144 "nvme_iov_md": false 00:14:24.144 }, 00:14:24.144 "memory_domains": [ 00:14:24.144 { 00:14:24.144 "dma_device_id": "system", 00:14:24.144 
"dma_device_type": 1 00:14:24.144 }, 00:14:24.144 { 00:14:24.144 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.144 "dma_device_type": 2 00:14:24.144 } 00:14:24.144 ], 00:14:24.144 "driver_specific": {} 00:14:24.144 }' 00:14:24.144 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.144 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.144 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.144 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.144 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.144 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.144 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:24.401 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.659 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.659 "name": 
"BaseBdev2", 00:14:24.659 "aliases": [ 00:14:24.659 "5e6f4eb8-eda4-44aa-8f66-be104566d3e4" 00:14:24.659 ], 00:14:24.659 "product_name": "Malloc disk", 00:14:24.659 "block_size": 512, 00:14:24.659 "num_blocks": 65536, 00:14:24.659 "uuid": "5e6f4eb8-eda4-44aa-8f66-be104566d3e4", 00:14:24.659 "assigned_rate_limits": { 00:14:24.659 "rw_ios_per_sec": 0, 00:14:24.659 "rw_mbytes_per_sec": 0, 00:14:24.659 "r_mbytes_per_sec": 0, 00:14:24.659 "w_mbytes_per_sec": 0 00:14:24.659 }, 00:14:24.659 "claimed": true, 00:14:24.659 "claim_type": "exclusive_write", 00:14:24.659 "zoned": false, 00:14:24.659 "supported_io_types": { 00:14:24.659 "read": true, 00:14:24.659 "write": true, 00:14:24.659 "unmap": true, 00:14:24.659 "flush": true, 00:14:24.659 "reset": true, 00:14:24.659 "nvme_admin": false, 00:14:24.659 "nvme_io": false, 00:14:24.659 "nvme_io_md": false, 00:14:24.659 "write_zeroes": true, 00:14:24.659 "zcopy": true, 00:14:24.659 "get_zone_info": false, 00:14:24.659 "zone_management": false, 00:14:24.659 "zone_append": false, 00:14:24.659 "compare": false, 00:14:24.659 "compare_and_write": false, 00:14:24.659 "abort": true, 00:14:24.659 "seek_hole": false, 00:14:24.659 "seek_data": false, 00:14:24.659 "copy": true, 00:14:24.659 "nvme_iov_md": false 00:14:24.659 }, 00:14:24.659 "memory_domains": [ 00:14:24.659 { 00:14:24.659 "dma_device_id": "system", 00:14:24.659 "dma_device_type": 1 00:14:24.659 }, 00:14:24.659 { 00:14:24.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.659 "dma_device_type": 2 00:14:24.659 } 00:14:24.659 ], 00:14:24.659 "driver_specific": {} 00:14:24.659 }' 00:14:24.659 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.659 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.659 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.659 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:24.659 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.918 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.918 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.918 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.918 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:24.918 10:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.918 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.918 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:24.918 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:24.918 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:24.918 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:25.175 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:25.175 "name": "BaseBdev3", 00:14:25.175 "aliases": [ 00:14:25.175 "0e4ca316-bb52-49da-96fa-2b06130fb224" 00:14:25.175 ], 00:14:25.175 "product_name": "Malloc disk", 00:14:25.175 "block_size": 512, 00:14:25.175 "num_blocks": 65536, 00:14:25.175 "uuid": "0e4ca316-bb52-49da-96fa-2b06130fb224", 00:14:25.175 "assigned_rate_limits": { 00:14:25.175 "rw_ios_per_sec": 0, 00:14:25.175 "rw_mbytes_per_sec": 0, 00:14:25.175 "r_mbytes_per_sec": 0, 00:14:25.175 "w_mbytes_per_sec": 0 00:14:25.175 }, 00:14:25.175 "claimed": true, 00:14:25.175 "claim_type": "exclusive_write", 00:14:25.175 "zoned": false, 00:14:25.175 "supported_io_types": { 
00:14:25.175 "read": true, 00:14:25.175 "write": true, 00:14:25.175 "unmap": true, 00:14:25.175 "flush": true, 00:14:25.175 "reset": true, 00:14:25.175 "nvme_admin": false, 00:14:25.175 "nvme_io": false, 00:14:25.175 "nvme_io_md": false, 00:14:25.175 "write_zeroes": true, 00:14:25.175 "zcopy": true, 00:14:25.175 "get_zone_info": false, 00:14:25.175 "zone_management": false, 00:14:25.175 "zone_append": false, 00:14:25.175 "compare": false, 00:14:25.175 "compare_and_write": false, 00:14:25.175 "abort": true, 00:14:25.175 "seek_hole": false, 00:14:25.175 "seek_data": false, 00:14:25.175 "copy": true, 00:14:25.175 "nvme_iov_md": false 00:14:25.175 }, 00:14:25.175 "memory_domains": [ 00:14:25.175 { 00:14:25.175 "dma_device_id": "system", 00:14:25.175 "dma_device_type": 1 00:14:25.175 }, 00:14:25.175 { 00:14:25.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:25.175 "dma_device_type": 2 00:14:25.175 } 00:14:25.175 ], 00:14:25.175 "driver_specific": {} 00:14:25.175 }' 00:14:25.175 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:25.432 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:25.691 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:25.691 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:25.691 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:25.950 [2024-07-15 10:22:02.914987] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:25.950 [2024-07-15 10:22:02.915013] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:25.950 [2024-07-15 10:22:02.915070] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.950 [2024-07-15 10:22:02.915121] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:25.950 [2024-07-15 10:22:02.915134] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e3450 name Existed_Raid, state offline 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 494791 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 494791 ']' 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 494791 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 494791 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 494791' 00:14:25.950 killing process with pid 494791 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 494791 00:14:25.950 [2024-07-15 10:22:02.985524] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:25.950 10:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 494791 00:14:25.950 [2024-07-15 10:22:03.012640] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:26.210 00:14:26.210 real 0m29.312s 00:14:26.210 user 0m53.808s 00:14:26.210 sys 0m5.165s 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.210 ************************************ 00:14:26.210 END TEST raid_state_function_test 00:14:26.210 ************************************ 00:14:26.210 10:22:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:26.210 10:22:03 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:14:26.210 10:22:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:26.210 10:22:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:26.210 10:22:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:26.210 ************************************ 00:14:26.210 START TEST raid_state_function_test_sb 00:14:26.210 ************************************ 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=499259 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 499259' 00:14:26.210 Process raid pid: 499259 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 499259 /var/tmp/spdk-raid.sock 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 499259 ']' 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:14:26.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:26.210 10:22:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:26.210 [2024-07-15 10:22:03.337203] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:14:26.210 [2024-07-15 10:22:03.337248] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:26.469 [2024-07-15 10:22:03.449476] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:26.469 [2024-07-15 10:22:03.555667] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:26.469 [2024-07-15 10:22:03.616129] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.469 [2024-07-15 10:22:03.616160] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:27.414 10:22:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:27.414 10:22:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:27.414 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:27.674 [2024-07-15 10:22:04.775844] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:27.674 [2024-07-15 10:22:04.775885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:27.674 [2024-07-15 10:22:04.775896] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:27.674 [2024-07-15 10:22:04.775908] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:27.674 [2024-07-15 10:22:04.775917] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:27.674 [2024-07-15 10:22:04.775935] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.674 10:22:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:14:27.933 10:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:27.933 "name": "Existed_Raid", 00:14:27.933 "uuid": "b9b19878-beea-42c9-bc06-0e830609f5d0", 00:14:27.933 "strip_size_kb": 64, 00:14:27.933 "state": "configuring", 00:14:27.933 "raid_level": "raid0", 00:14:27.933 "superblock": true, 00:14:27.934 "num_base_bdevs": 3, 00:14:27.934 "num_base_bdevs_discovered": 0, 00:14:27.934 "num_base_bdevs_operational": 3, 00:14:27.934 "base_bdevs_list": [ 00:14:27.934 { 00:14:27.934 "name": "BaseBdev1", 00:14:27.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.934 "is_configured": false, 00:14:27.934 "data_offset": 0, 00:14:27.934 "data_size": 0 00:14:27.934 }, 00:14:27.934 { 00:14:27.934 "name": "BaseBdev2", 00:14:27.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.934 "is_configured": false, 00:14:27.934 "data_offset": 0, 00:14:27.934 "data_size": 0 00:14:27.934 }, 00:14:27.934 { 00:14:27.934 "name": "BaseBdev3", 00:14:27.934 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.934 "is_configured": false, 00:14:27.934 "data_offset": 0, 00:14:27.934 "data_size": 0 00:14:27.934 } 00:14:27.934 ] 00:14:27.934 }' 00:14:27.934 10:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.934 10:22:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:28.502 10:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:28.760 [2024-07-15 10:22:05.866574] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:28.760 [2024-07-15 10:22:05.866606] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11afa80 name Existed_Raid, state configuring 00:14:28.760 10:22:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:29.019 [2024-07-15 10:22:06.111247] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:29.019 [2024-07-15 10:22:06.111278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:29.019 [2024-07-15 10:22:06.111288] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:29.019 [2024-07-15 10:22:06.111300] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:29.019 [2024-07-15 10:22:06.111309] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:29.019 [2024-07-15 10:22:06.111320] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:29.019 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:29.277 [2024-07-15 10:22:06.361733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:29.277 BaseBdev1 00:14:29.277 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:29.277 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:29.277 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:29.277 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:29.277 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:29.277 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:29.277 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:29.535 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:29.793 [ 00:14:29.793 { 00:14:29.793 "name": "BaseBdev1", 00:14:29.793 "aliases": [ 00:14:29.793 "871f011e-ebec-4693-a236-cb652076bb69" 00:14:29.793 ], 00:14:29.793 "product_name": "Malloc disk", 00:14:29.793 "block_size": 512, 00:14:29.793 "num_blocks": 65536, 00:14:29.793 "uuid": "871f011e-ebec-4693-a236-cb652076bb69", 00:14:29.793 "assigned_rate_limits": { 00:14:29.793 "rw_ios_per_sec": 0, 00:14:29.793 "rw_mbytes_per_sec": 0, 00:14:29.793 "r_mbytes_per_sec": 0, 00:14:29.793 "w_mbytes_per_sec": 0 00:14:29.793 }, 00:14:29.793 "claimed": true, 00:14:29.793 "claim_type": "exclusive_write", 00:14:29.793 "zoned": false, 00:14:29.793 "supported_io_types": { 00:14:29.793 "read": true, 00:14:29.793 "write": true, 00:14:29.793 "unmap": true, 00:14:29.793 "flush": true, 00:14:29.793 "reset": true, 00:14:29.793 "nvme_admin": false, 00:14:29.793 "nvme_io": false, 00:14:29.793 "nvme_io_md": false, 00:14:29.793 "write_zeroes": true, 00:14:29.793 "zcopy": true, 00:14:29.793 "get_zone_info": false, 00:14:29.793 "zone_management": false, 00:14:29.793 "zone_append": false, 00:14:29.793 "compare": false, 00:14:29.793 "compare_and_write": false, 00:14:29.793 "abort": true, 00:14:29.793 "seek_hole": false, 00:14:29.793 "seek_data": false, 00:14:29.793 "copy": true, 00:14:29.793 "nvme_iov_md": false 00:14:29.793 }, 00:14:29.793 "memory_domains": [ 00:14:29.793 { 00:14:29.793 "dma_device_id": "system", 00:14:29.793 "dma_device_type": 1 00:14:29.793 }, 00:14:29.793 { 00:14:29.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.793 
"dma_device_type": 2 00:14:29.793 } 00:14:29.793 ], 00:14:29.793 "driver_specific": {} 00:14:29.793 } 00:14:29.793 ] 00:14:29.793 10:22:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:29.793 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:29.793 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:29.793 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:29.793 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.794 10:22:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:30.053 10:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:30.053 "name": "Existed_Raid", 00:14:30.053 "uuid": "cdcc8236-55bf-4894-91db-dade8877043a", 00:14:30.053 "strip_size_kb": 64, 
00:14:30.053 "state": "configuring", 00:14:30.053 "raid_level": "raid0", 00:14:30.053 "superblock": true, 00:14:30.053 "num_base_bdevs": 3, 00:14:30.053 "num_base_bdevs_discovered": 1, 00:14:30.053 "num_base_bdevs_operational": 3, 00:14:30.053 "base_bdevs_list": [ 00:14:30.053 { 00:14:30.053 "name": "BaseBdev1", 00:14:30.053 "uuid": "871f011e-ebec-4693-a236-cb652076bb69", 00:14:30.053 "is_configured": true, 00:14:30.053 "data_offset": 2048, 00:14:30.053 "data_size": 63488 00:14:30.053 }, 00:14:30.053 { 00:14:30.053 "name": "BaseBdev2", 00:14:30.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.053 "is_configured": false, 00:14:30.053 "data_offset": 0, 00:14:30.053 "data_size": 0 00:14:30.053 }, 00:14:30.053 { 00:14:30.053 "name": "BaseBdev3", 00:14:30.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.053 "is_configured": false, 00:14:30.053 "data_offset": 0, 00:14:30.053 "data_size": 0 00:14:30.053 } 00:14:30.053 ] 00:14:30.053 }' 00:14:30.053 10:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.053 10:22:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:30.619 10:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:30.878 [2024-07-15 10:22:07.921877] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:30.878 [2024-07-15 10:22:07.921917] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11af310 name Existed_Raid, state configuring 00:14:30.878 10:22:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:31.138 [2024-07-15 10:22:08.170577] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:31.138 [2024-07-15 10:22:08.172020] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:31.138 [2024-07-15 10:22:08.172051] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:31.138 [2024-07-15 10:22:08.172062] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:31.138 [2024-07-15 10:22:08.172073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.138 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.397 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.397 "name": "Existed_Raid", 00:14:31.397 "uuid": "ff3a66d8-b807-4844-b86e-baa202be386c", 00:14:31.397 "strip_size_kb": 64, 00:14:31.397 "state": "configuring", 00:14:31.397 "raid_level": "raid0", 00:14:31.397 "superblock": true, 00:14:31.397 "num_base_bdevs": 3, 00:14:31.397 "num_base_bdevs_discovered": 1, 00:14:31.397 "num_base_bdevs_operational": 3, 00:14:31.397 "base_bdevs_list": [ 00:14:31.397 { 00:14:31.397 "name": "BaseBdev1", 00:14:31.397 "uuid": "871f011e-ebec-4693-a236-cb652076bb69", 00:14:31.397 "is_configured": true, 00:14:31.397 "data_offset": 2048, 00:14:31.397 "data_size": 63488 00:14:31.397 }, 00:14:31.397 { 00:14:31.397 "name": "BaseBdev2", 00:14:31.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.397 "is_configured": false, 00:14:31.397 "data_offset": 0, 00:14:31.397 "data_size": 0 00:14:31.397 }, 00:14:31.397 { 00:14:31.397 "name": "BaseBdev3", 00:14:31.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.397 "is_configured": false, 00:14:31.397 "data_offset": 0, 00:14:31.397 "data_size": 0 00:14:31.397 } 00:14:31.397 ] 00:14:31.397 }' 00:14:31.397 10:22:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.397 10:22:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.965 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:32.225 
[2024-07-15 10:22:09.260891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:32.225 BaseBdev2 00:14:32.225 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:32.225 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:32.225 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:32.225 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:32.225 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:32.225 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:32.225 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:32.484 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:32.743 [ 00:14:32.743 { 00:14:32.743 "name": "BaseBdev2", 00:14:32.743 "aliases": [ 00:14:32.743 "e98e0188-f234-4d6a-8b4d-60931aa957dd" 00:14:32.743 ], 00:14:32.743 "product_name": "Malloc disk", 00:14:32.743 "block_size": 512, 00:14:32.743 "num_blocks": 65536, 00:14:32.743 "uuid": "e98e0188-f234-4d6a-8b4d-60931aa957dd", 00:14:32.743 "assigned_rate_limits": { 00:14:32.743 "rw_ios_per_sec": 0, 00:14:32.743 "rw_mbytes_per_sec": 0, 00:14:32.743 "r_mbytes_per_sec": 0, 00:14:32.743 "w_mbytes_per_sec": 0 00:14:32.743 }, 00:14:32.743 "claimed": true, 00:14:32.743 "claim_type": "exclusive_write", 00:14:32.743 "zoned": false, 00:14:32.743 "supported_io_types": { 00:14:32.743 "read": true, 00:14:32.743 "write": true, 00:14:32.743 "unmap": 
true, 00:14:32.743 "flush": true, 00:14:32.743 "reset": true, 00:14:32.743 "nvme_admin": false, 00:14:32.743 "nvme_io": false, 00:14:32.743 "nvme_io_md": false, 00:14:32.743 "write_zeroes": true, 00:14:32.743 "zcopy": true, 00:14:32.743 "get_zone_info": false, 00:14:32.743 "zone_management": false, 00:14:32.743 "zone_append": false, 00:14:32.743 "compare": false, 00:14:32.743 "compare_and_write": false, 00:14:32.743 "abort": true, 00:14:32.743 "seek_hole": false, 00:14:32.743 "seek_data": false, 00:14:32.743 "copy": true, 00:14:32.743 "nvme_iov_md": false 00:14:32.743 }, 00:14:32.743 "memory_domains": [ 00:14:32.743 { 00:14:32.743 "dma_device_id": "system", 00:14:32.743 "dma_device_type": 1 00:14:32.743 }, 00:14:32.743 { 00:14:32.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.743 "dma_device_type": 2 00:14:32.743 } 00:14:32.743 ], 00:14:32.743 "driver_specific": {} 00:14:32.743 } 00:14:32.743 ] 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:32.743 
10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.743 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:33.002 10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:33.002 "name": "Existed_Raid", 00:14:33.002 "uuid": "ff3a66d8-b807-4844-b86e-baa202be386c", 00:14:33.002 "strip_size_kb": 64, 00:14:33.002 "state": "configuring", 00:14:33.002 "raid_level": "raid0", 00:14:33.002 "superblock": true, 00:14:33.002 "num_base_bdevs": 3, 00:14:33.002 "num_base_bdevs_discovered": 2, 00:14:33.002 "num_base_bdevs_operational": 3, 00:14:33.002 "base_bdevs_list": [ 00:14:33.002 { 00:14:33.002 "name": "BaseBdev1", 00:14:33.002 "uuid": "871f011e-ebec-4693-a236-cb652076bb69", 00:14:33.002 "is_configured": true, 00:14:33.002 "data_offset": 2048, 00:14:33.002 "data_size": 63488 00:14:33.002 }, 00:14:33.002 { 00:14:33.002 "name": "BaseBdev2", 00:14:33.002 "uuid": "e98e0188-f234-4d6a-8b4d-60931aa957dd", 00:14:33.002 "is_configured": true, 00:14:33.002 "data_offset": 2048, 00:14:33.002 "data_size": 63488 00:14:33.002 }, 00:14:33.002 { 00:14:33.002 "name": "BaseBdev3", 00:14:33.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:33.002 "is_configured": false, 00:14:33.002 "data_offset": 0, 00:14:33.002 "data_size": 0 00:14:33.002 } 00:14:33.002 ] 00:14:33.002 }' 00:14:33.002 
10:22:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:33.002 10:22:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:33.570 [2024-07-15 10:22:10.749493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:33.570 [2024-07-15 10:22:10.749659] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11b0400 00:14:33.570 [2024-07-15 10:22:10.749673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:33.570 [2024-07-15 10:22:10.749843] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11afef0 00:14:33.570 [2024-07-15 10:22:10.749967] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11b0400 00:14:33.570 [2024-07-15 10:22:10.749978] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11b0400 00:14:33.570 [2024-07-15 10:22:10.750071] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:33.570 BaseBdev3 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:14:33.570 10:22:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:33.830 10:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:34.090 [ 00:14:34.090 { 00:14:34.090 "name": "BaseBdev3", 00:14:34.090 "aliases": [ 00:14:34.090 "d90f4a17-2e3d-4468-8c75-c4c31f94744e" 00:14:34.090 ], 00:14:34.090 "product_name": "Malloc disk", 00:14:34.090 "block_size": 512, 00:14:34.090 "num_blocks": 65536, 00:14:34.090 "uuid": "d90f4a17-2e3d-4468-8c75-c4c31f94744e", 00:14:34.090 "assigned_rate_limits": { 00:14:34.090 "rw_ios_per_sec": 0, 00:14:34.090 "rw_mbytes_per_sec": 0, 00:14:34.090 "r_mbytes_per_sec": 0, 00:14:34.090 "w_mbytes_per_sec": 0 00:14:34.090 }, 00:14:34.090 "claimed": true, 00:14:34.090 "claim_type": "exclusive_write", 00:14:34.090 "zoned": false, 00:14:34.090 "supported_io_types": { 00:14:34.090 "read": true, 00:14:34.090 "write": true, 00:14:34.090 "unmap": true, 00:14:34.090 "flush": true, 00:14:34.090 "reset": true, 00:14:34.090 "nvme_admin": false, 00:14:34.090 "nvme_io": false, 00:14:34.090 "nvme_io_md": false, 00:14:34.090 "write_zeroes": true, 00:14:34.090 "zcopy": true, 00:14:34.090 "get_zone_info": false, 00:14:34.090 "zone_management": false, 00:14:34.090 "zone_append": false, 00:14:34.090 "compare": false, 00:14:34.090 "compare_and_write": false, 00:14:34.090 "abort": true, 00:14:34.090 "seek_hole": false, 00:14:34.090 "seek_data": false, 00:14:34.090 "copy": true, 00:14:34.090 "nvme_iov_md": false 00:14:34.090 }, 00:14:34.090 "memory_domains": [ 00:14:34.090 { 00:14:34.090 "dma_device_id": "system", 00:14:34.090 "dma_device_type": 1 00:14:34.090 }, 00:14:34.090 { 00:14:34.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.090 
"dma_device_type": 2 00:14:34.090 } 00:14:34.090 ], 00:14:34.090 "driver_specific": {} 00:14:34.090 } 00:14:34.090 ] 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.090 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:34.349 10:22:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.349 "name": "Existed_Raid", 00:14:34.349 "uuid": "ff3a66d8-b807-4844-b86e-baa202be386c", 00:14:34.349 "strip_size_kb": 64, 00:14:34.349 "state": "online", 00:14:34.349 "raid_level": "raid0", 00:14:34.349 "superblock": true, 00:14:34.349 "num_base_bdevs": 3, 00:14:34.349 "num_base_bdevs_discovered": 3, 00:14:34.349 "num_base_bdevs_operational": 3, 00:14:34.349 "base_bdevs_list": [ 00:14:34.349 { 00:14:34.349 "name": "BaseBdev1", 00:14:34.349 "uuid": "871f011e-ebec-4693-a236-cb652076bb69", 00:14:34.349 "is_configured": true, 00:14:34.349 "data_offset": 2048, 00:14:34.349 "data_size": 63488 00:14:34.349 }, 00:14:34.349 { 00:14:34.349 "name": "BaseBdev2", 00:14:34.349 "uuid": "e98e0188-f234-4d6a-8b4d-60931aa957dd", 00:14:34.349 "is_configured": true, 00:14:34.349 "data_offset": 2048, 00:14:34.349 "data_size": 63488 00:14:34.349 }, 00:14:34.349 { 00:14:34.349 "name": "BaseBdev3", 00:14:34.349 "uuid": "d90f4a17-2e3d-4468-8c75-c4c31f94744e", 00:14:34.349 "is_configured": true, 00:14:34.349 "data_offset": 2048, 00:14:34.349 "data_size": 63488 00:14:34.349 } 00:14:34.349 ] 00:14:34.349 }' 00:14:34.349 10:22:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.349 10:22:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:34.977 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:34.977 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:35.238 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:35.238 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:35.238 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:14:35.238 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:35.238 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:35.238 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:35.238 [2024-07-15 10:22:12.350067] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:35.238 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:35.238 "name": "Existed_Raid", 00:14:35.238 "aliases": [ 00:14:35.238 "ff3a66d8-b807-4844-b86e-baa202be386c" 00:14:35.238 ], 00:14:35.238 "product_name": "Raid Volume", 00:14:35.238 "block_size": 512, 00:14:35.238 "num_blocks": 190464, 00:14:35.238 "uuid": "ff3a66d8-b807-4844-b86e-baa202be386c", 00:14:35.238 "assigned_rate_limits": { 00:14:35.238 "rw_ios_per_sec": 0, 00:14:35.238 "rw_mbytes_per_sec": 0, 00:14:35.238 "r_mbytes_per_sec": 0, 00:14:35.238 "w_mbytes_per_sec": 0 00:14:35.238 }, 00:14:35.238 "claimed": false, 00:14:35.238 "zoned": false, 00:14:35.238 "supported_io_types": { 00:14:35.238 "read": true, 00:14:35.238 "write": true, 00:14:35.238 "unmap": true, 00:14:35.238 "flush": true, 00:14:35.238 "reset": true, 00:14:35.238 "nvme_admin": false, 00:14:35.238 "nvme_io": false, 00:14:35.238 "nvme_io_md": false, 00:14:35.238 "write_zeroes": true, 00:14:35.238 "zcopy": false, 00:14:35.238 "get_zone_info": false, 00:14:35.238 "zone_management": false, 00:14:35.238 "zone_append": false, 00:14:35.238 "compare": false, 00:14:35.238 "compare_and_write": false, 00:14:35.238 "abort": false, 00:14:35.238 "seek_hole": false, 00:14:35.238 "seek_data": false, 00:14:35.238 "copy": false, 00:14:35.238 "nvme_iov_md": false 00:14:35.238 }, 00:14:35.238 "memory_domains": [ 00:14:35.238 { 00:14:35.238 "dma_device_id": "system", 00:14:35.238 
"dma_device_type": 1 00:14:35.238 }, 00:14:35.238 { 00:14:35.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.238 "dma_device_type": 2 00:14:35.238 }, 00:14:35.238 { 00:14:35.238 "dma_device_id": "system", 00:14:35.238 "dma_device_type": 1 00:14:35.238 }, 00:14:35.238 { 00:14:35.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.238 "dma_device_type": 2 00:14:35.238 }, 00:14:35.238 { 00:14:35.238 "dma_device_id": "system", 00:14:35.238 "dma_device_type": 1 00:14:35.238 }, 00:14:35.238 { 00:14:35.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.238 "dma_device_type": 2 00:14:35.238 } 00:14:35.238 ], 00:14:35.238 "driver_specific": { 00:14:35.238 "raid": { 00:14:35.238 "uuid": "ff3a66d8-b807-4844-b86e-baa202be386c", 00:14:35.238 "strip_size_kb": 64, 00:14:35.238 "state": "online", 00:14:35.238 "raid_level": "raid0", 00:14:35.239 "superblock": true, 00:14:35.239 "num_base_bdevs": 3, 00:14:35.239 "num_base_bdevs_discovered": 3, 00:14:35.239 "num_base_bdevs_operational": 3, 00:14:35.239 "base_bdevs_list": [ 00:14:35.239 { 00:14:35.239 "name": "BaseBdev1", 00:14:35.239 "uuid": "871f011e-ebec-4693-a236-cb652076bb69", 00:14:35.239 "is_configured": true, 00:14:35.239 "data_offset": 2048, 00:14:35.239 "data_size": 63488 00:14:35.239 }, 00:14:35.239 { 00:14:35.239 "name": "BaseBdev2", 00:14:35.239 "uuid": "e98e0188-f234-4d6a-8b4d-60931aa957dd", 00:14:35.239 "is_configured": true, 00:14:35.239 "data_offset": 2048, 00:14:35.239 "data_size": 63488 00:14:35.239 }, 00:14:35.239 { 00:14:35.239 "name": "BaseBdev3", 00:14:35.239 "uuid": "d90f4a17-2e3d-4468-8c75-c4c31f94744e", 00:14:35.239 "is_configured": true, 00:14:35.239 "data_offset": 2048, 00:14:35.239 "data_size": 63488 00:14:35.239 } 00:14:35.239 ] 00:14:35.239 } 00:14:35.239 } 00:14:35.239 }' 00:14:35.239 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:35.239 10:22:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:35.239 BaseBdev2 00:14:35.239 BaseBdev3' 00:14:35.239 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:35.239 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:35.239 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:35.498 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:35.498 "name": "BaseBdev1", 00:14:35.498 "aliases": [ 00:14:35.498 "871f011e-ebec-4693-a236-cb652076bb69" 00:14:35.498 ], 00:14:35.498 "product_name": "Malloc disk", 00:14:35.498 "block_size": 512, 00:14:35.498 "num_blocks": 65536, 00:14:35.498 "uuid": "871f011e-ebec-4693-a236-cb652076bb69", 00:14:35.498 "assigned_rate_limits": { 00:14:35.498 "rw_ios_per_sec": 0, 00:14:35.498 "rw_mbytes_per_sec": 0, 00:14:35.498 "r_mbytes_per_sec": 0, 00:14:35.498 "w_mbytes_per_sec": 0 00:14:35.498 }, 00:14:35.498 "claimed": true, 00:14:35.498 "claim_type": "exclusive_write", 00:14:35.498 "zoned": false, 00:14:35.498 "supported_io_types": { 00:14:35.498 "read": true, 00:14:35.498 "write": true, 00:14:35.498 "unmap": true, 00:14:35.498 "flush": true, 00:14:35.498 "reset": true, 00:14:35.498 "nvme_admin": false, 00:14:35.498 "nvme_io": false, 00:14:35.498 "nvme_io_md": false, 00:14:35.498 "write_zeroes": true, 00:14:35.498 "zcopy": true, 00:14:35.498 "get_zone_info": false, 00:14:35.498 "zone_management": false, 00:14:35.498 "zone_append": false, 00:14:35.498 "compare": false, 00:14:35.498 "compare_and_write": false, 00:14:35.498 "abort": true, 00:14:35.498 "seek_hole": false, 00:14:35.498 "seek_data": false, 00:14:35.498 "copy": true, 00:14:35.498 "nvme_iov_md": false 00:14:35.498 }, 00:14:35.498 "memory_domains": 
[ 00:14:35.498 { 00:14:35.498 "dma_device_id": "system", 00:14:35.498 "dma_device_type": 1 00:14:35.498 }, 00:14:35.498 { 00:14:35.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.498 "dma_device_type": 2 00:14:35.498 } 00:14:35.498 ], 00:14:35.498 "driver_specific": {} 00:14:35.498 }' 00:14:35.498 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.498 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:35.756 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:35.756 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:35.756 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:35.756 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:35.756 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:35.756 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:36.015 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:36.015 10:22:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.015 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.015 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:36.015 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:36.015 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:36.015 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:14:36.273 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:36.273 "name": "BaseBdev2", 00:14:36.273 "aliases": [ 00:14:36.273 "e98e0188-f234-4d6a-8b4d-60931aa957dd" 00:14:36.273 ], 00:14:36.273 "product_name": "Malloc disk", 00:14:36.273 "block_size": 512, 00:14:36.273 "num_blocks": 65536, 00:14:36.273 "uuid": "e98e0188-f234-4d6a-8b4d-60931aa957dd", 00:14:36.273 "assigned_rate_limits": { 00:14:36.273 "rw_ios_per_sec": 0, 00:14:36.273 "rw_mbytes_per_sec": 0, 00:14:36.273 "r_mbytes_per_sec": 0, 00:14:36.273 "w_mbytes_per_sec": 0 00:14:36.273 }, 00:14:36.273 "claimed": true, 00:14:36.273 "claim_type": "exclusive_write", 00:14:36.273 "zoned": false, 00:14:36.273 "supported_io_types": { 00:14:36.273 "read": true, 00:14:36.273 "write": true, 00:14:36.273 "unmap": true, 00:14:36.273 "flush": true, 00:14:36.273 "reset": true, 00:14:36.273 "nvme_admin": false, 00:14:36.273 "nvme_io": false, 00:14:36.273 "nvme_io_md": false, 00:14:36.273 "write_zeroes": true, 00:14:36.273 "zcopy": true, 00:14:36.273 "get_zone_info": false, 00:14:36.273 "zone_management": false, 00:14:36.273 "zone_append": false, 00:14:36.273 "compare": false, 00:14:36.273 "compare_and_write": false, 00:14:36.273 "abort": true, 00:14:36.273 "seek_hole": false, 00:14:36.273 "seek_data": false, 00:14:36.273 "copy": true, 00:14:36.273 "nvme_iov_md": false 00:14:36.273 }, 00:14:36.273 "memory_domains": [ 00:14:36.273 { 00:14:36.273 "dma_device_id": "system", 00:14:36.273 "dma_device_type": 1 00:14:36.273 }, 00:14:36.273 { 00:14:36.273 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.273 "dma_device_type": 2 00:14:36.273 } 00:14:36.273 ], 00:14:36.273 "driver_specific": {} 00:14:36.273 }' 00:14:36.273 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:36.273 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:36.273 10:22:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:36.273 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:36.273 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:36.531 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:36.789 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:36.789 "name": "BaseBdev3", 00:14:36.789 "aliases": [ 00:14:36.789 "d90f4a17-2e3d-4468-8c75-c4c31f94744e" 00:14:36.789 ], 00:14:36.789 "product_name": "Malloc disk", 00:14:36.789 "block_size": 512, 00:14:36.789 "num_blocks": 65536, 00:14:36.789 "uuid": "d90f4a17-2e3d-4468-8c75-c4c31f94744e", 00:14:36.789 "assigned_rate_limits": { 00:14:36.789 "rw_ios_per_sec": 0, 00:14:36.789 "rw_mbytes_per_sec": 0, 00:14:36.789 "r_mbytes_per_sec": 0, 00:14:36.789 
"w_mbytes_per_sec": 0 00:14:36.789 }, 00:14:36.789 "claimed": true, 00:14:36.789 "claim_type": "exclusive_write", 00:14:36.789 "zoned": false, 00:14:36.789 "supported_io_types": { 00:14:36.789 "read": true, 00:14:36.789 "write": true, 00:14:36.789 "unmap": true, 00:14:36.789 "flush": true, 00:14:36.789 "reset": true, 00:14:36.789 "nvme_admin": false, 00:14:36.789 "nvme_io": false, 00:14:36.789 "nvme_io_md": false, 00:14:36.789 "write_zeroes": true, 00:14:36.789 "zcopy": true, 00:14:36.789 "get_zone_info": false, 00:14:36.789 "zone_management": false, 00:14:36.789 "zone_append": false, 00:14:36.789 "compare": false, 00:14:36.789 "compare_and_write": false, 00:14:36.790 "abort": true, 00:14:36.790 "seek_hole": false, 00:14:36.790 "seek_data": false, 00:14:36.790 "copy": true, 00:14:36.790 "nvme_iov_md": false 00:14:36.790 }, 00:14:36.790 "memory_domains": [ 00:14:36.790 { 00:14:36.790 "dma_device_id": "system", 00:14:36.790 "dma_device_type": 1 00:14:36.790 }, 00:14:36.790 { 00:14:36.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:36.790 "dma_device_type": 2 00:14:36.790 } 00:14:36.790 ], 00:14:36.790 "driver_specific": {} 00:14:36.790 }' 00:14:36.790 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:36.790 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:37.048 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:37.048 10:22:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.048 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:37.048 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:37.048 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:37.048 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:14:37.048 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:37.048 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.048 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:37.307 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:37.307 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:37.307 [2024-07-15 10:22:14.487496] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:37.307 [2024-07-15 10:22:14.487526] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:37.307 [2024-07-15 10:22:14.487569] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.566 "name": "Existed_Raid", 00:14:37.566 "uuid": "ff3a66d8-b807-4844-b86e-baa202be386c", 00:14:37.566 "strip_size_kb": 64, 00:14:37.566 "state": "offline", 00:14:37.566 "raid_level": "raid0", 00:14:37.566 "superblock": true, 00:14:37.566 "num_base_bdevs": 3, 00:14:37.566 "num_base_bdevs_discovered": 2, 00:14:37.566 "num_base_bdevs_operational": 2, 00:14:37.566 "base_bdevs_list": [ 00:14:37.566 { 00:14:37.566 "name": null, 00:14:37.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.566 "is_configured": false, 00:14:37.566 "data_offset": 2048, 00:14:37.566 "data_size": 63488 00:14:37.566 }, 00:14:37.566 { 00:14:37.566 "name": "BaseBdev2", 00:14:37.566 "uuid": "e98e0188-f234-4d6a-8b4d-60931aa957dd", 00:14:37.566 "is_configured": true, 00:14:37.566 "data_offset": 2048, 00:14:37.566 "data_size": 
63488 00:14:37.566 }, 00:14:37.566 { 00:14:37.566 "name": "BaseBdev3", 00:14:37.566 "uuid": "d90f4a17-2e3d-4468-8c75-c4c31f94744e", 00:14:37.566 "is_configured": true, 00:14:37.566 "data_offset": 2048, 00:14:37.566 "data_size": 63488 00:14:37.566 } 00:14:37.566 ] 00:14:37.566 }' 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.566 10:22:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:38.134 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:38.134 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:38.134 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.134 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:38.393 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:38.393 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:38.393 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:38.652 [2024-07-15 10:22:15.760965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:38.652 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:38.652 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:38.652 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:38.652 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:38.910 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:38.910 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:38.910 10:22:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:39.169 [2024-07-15 10:22:16.114382] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:39.169 [2024-07-15 10:22:16.114429] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b0400 name Existed_Raid, state offline 00:14:39.169 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:39.169 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:39.169 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.169 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:39.427 BaseBdev2 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:39.427 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.685 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:39.943 [ 00:14:39.943 { 00:14:39.943 "name": "BaseBdev2", 00:14:39.943 "aliases": [ 00:14:39.943 "f61d7844-0569-41b7-b499-66cd4616b018" 00:14:39.943 ], 00:14:39.943 "product_name": "Malloc disk", 00:14:39.943 "block_size": 512, 00:14:39.943 "num_blocks": 65536, 00:14:39.943 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:39.943 "assigned_rate_limits": { 00:14:39.943 "rw_ios_per_sec": 0, 00:14:39.943 "rw_mbytes_per_sec": 0, 00:14:39.943 "r_mbytes_per_sec": 0, 00:14:39.943 "w_mbytes_per_sec": 0 00:14:39.943 }, 00:14:39.943 "claimed": false, 00:14:39.943 "zoned": false, 00:14:39.943 "supported_io_types": { 00:14:39.943 "read": true, 00:14:39.943 "write": true, 00:14:39.943 "unmap": true, 00:14:39.943 "flush": 
true, 00:14:39.943 "reset": true, 00:14:39.943 "nvme_admin": false, 00:14:39.943 "nvme_io": false, 00:14:39.943 "nvme_io_md": false, 00:14:39.943 "write_zeroes": true, 00:14:39.943 "zcopy": true, 00:14:39.943 "get_zone_info": false, 00:14:39.943 "zone_management": false, 00:14:39.943 "zone_append": false, 00:14:39.943 "compare": false, 00:14:39.943 "compare_and_write": false, 00:14:39.943 "abort": true, 00:14:39.943 "seek_hole": false, 00:14:39.943 "seek_data": false, 00:14:39.943 "copy": true, 00:14:39.943 "nvme_iov_md": false 00:14:39.943 }, 00:14:39.943 "memory_domains": [ 00:14:39.943 { 00:14:39.943 "dma_device_id": "system", 00:14:39.943 "dma_device_type": 1 00:14:39.943 }, 00:14:39.943 { 00:14:39.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.943 "dma_device_type": 2 00:14:39.943 } 00:14:39.943 ], 00:14:39.943 "driver_specific": {} 00:14:39.943 } 00:14:39.943 ] 00:14:39.944 10:22:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:39.944 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:39.944 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:39.944 10:22:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:40.203 BaseBdev3 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:40.203 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:40.462 [ 00:14:40.462 { 00:14:40.462 "name": "BaseBdev3", 00:14:40.462 "aliases": [ 00:14:40.462 "2ca94c0b-4963-4482-9a56-8bda7cbf2290" 00:14:40.462 ], 00:14:40.462 "product_name": "Malloc disk", 00:14:40.462 "block_size": 512, 00:14:40.462 "num_blocks": 65536, 00:14:40.462 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:40.462 "assigned_rate_limits": { 00:14:40.462 "rw_ios_per_sec": 0, 00:14:40.462 "rw_mbytes_per_sec": 0, 00:14:40.462 "r_mbytes_per_sec": 0, 00:14:40.462 "w_mbytes_per_sec": 0 00:14:40.462 }, 00:14:40.462 "claimed": false, 00:14:40.462 "zoned": false, 00:14:40.462 "supported_io_types": { 00:14:40.462 "read": true, 00:14:40.462 "write": true, 00:14:40.462 "unmap": true, 00:14:40.462 "flush": true, 00:14:40.462 "reset": true, 00:14:40.462 "nvme_admin": false, 00:14:40.462 "nvme_io": false, 00:14:40.462 "nvme_io_md": false, 00:14:40.462 "write_zeroes": true, 00:14:40.462 "zcopy": true, 00:14:40.462 "get_zone_info": false, 00:14:40.462 "zone_management": false, 00:14:40.462 "zone_append": false, 00:14:40.462 "compare": false, 00:14:40.462 "compare_and_write": false, 00:14:40.462 "abort": true, 00:14:40.462 "seek_hole": false, 00:14:40.462 "seek_data": false, 00:14:40.462 "copy": true, 00:14:40.462 "nvme_iov_md": false 00:14:40.462 }, 00:14:40.462 "memory_domains": [ 00:14:40.462 { 00:14:40.462 "dma_device_id": "system", 00:14:40.462 "dma_device_type": 1 
00:14:40.462 }, 00:14:40.462 { 00:14:40.462 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.462 "dma_device_type": 2 00:14:40.462 } 00:14:40.462 ], 00:14:40.462 "driver_specific": {} 00:14:40.462 } 00:14:40.462 ] 00:14:40.462 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:40.462 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:40.462 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:40.462 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:40.722 [2024-07-15 10:22:17.707459] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:40.722 [2024-07-15 10:22:17.707501] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:40.722 [2024-07-15 10:22:17.707521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:40.722 [2024-07-15 10:22:17.708849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.722 "name": "Existed_Raid", 00:14:40.722 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:40.722 "strip_size_kb": 64, 00:14:40.722 "state": "configuring", 00:14:40.722 "raid_level": "raid0", 00:14:40.722 "superblock": true, 00:14:40.722 "num_base_bdevs": 3, 00:14:40.722 "num_base_bdevs_discovered": 2, 00:14:40.722 "num_base_bdevs_operational": 3, 00:14:40.722 "base_bdevs_list": [ 00:14:40.722 { 00:14:40.722 "name": "BaseBdev1", 00:14:40.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.722 "is_configured": false, 00:14:40.722 "data_offset": 0, 00:14:40.722 "data_size": 0 00:14:40.722 }, 00:14:40.722 { 00:14:40.722 "name": "BaseBdev2", 00:14:40.722 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:40.722 "is_configured": true, 00:14:40.722 "data_offset": 2048, 00:14:40.722 "data_size": 63488 00:14:40.722 }, 00:14:40.722 { 00:14:40.722 "name": "BaseBdev3", 00:14:40.722 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:40.722 "is_configured": true, 00:14:40.722 "data_offset": 2048, 00:14:40.722 
"data_size": 63488 00:14:40.722 } 00:14:40.722 ] 00:14:40.722 }' 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.722 10:22:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:41.290 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:41.547 [2024-07-15 10:22:18.682015] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:14:41.547 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.805 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.805 "name": "Existed_Raid", 00:14:41.805 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:41.805 "strip_size_kb": 64, 00:14:41.805 "state": "configuring", 00:14:41.805 "raid_level": "raid0", 00:14:41.805 "superblock": true, 00:14:41.805 "num_base_bdevs": 3, 00:14:41.805 "num_base_bdevs_discovered": 1, 00:14:41.805 "num_base_bdevs_operational": 3, 00:14:41.805 "base_bdevs_list": [ 00:14:41.805 { 00:14:41.805 "name": "BaseBdev1", 00:14:41.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.805 "is_configured": false, 00:14:41.805 "data_offset": 0, 00:14:41.805 "data_size": 0 00:14:41.805 }, 00:14:41.805 { 00:14:41.805 "name": null, 00:14:41.805 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:41.805 "is_configured": false, 00:14:41.805 "data_offset": 2048, 00:14:41.805 "data_size": 63488 00:14:41.805 }, 00:14:41.805 { 00:14:41.805 "name": "BaseBdev3", 00:14:41.805 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:41.805 "is_configured": true, 00:14:41.805 "data_offset": 2048, 00:14:41.805 "data_size": 63488 00:14:41.805 } 00:14:41.805 ] 00:14:41.805 }' 00:14:41.805 10:22:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.805 10:22:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:42.371 10:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.371 10:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:42.629 10:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:14:42.629 10:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:42.887 [2024-07-15 10:22:19.944794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:42.887 BaseBdev1 00:14:42.887 10:22:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:42.887 10:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:42.887 10:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:42.887 10:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:42.887 10:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:42.887 10:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:42.887 10:22:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:43.145 10:22:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:43.709 [ 00:14:43.709 { 00:14:43.709 "name": "BaseBdev1", 00:14:43.709 "aliases": [ 00:14:43.709 "be01a01c-a8f9-43b7-987f-4b6851a933b7" 00:14:43.709 ], 00:14:43.709 "product_name": "Malloc disk", 00:14:43.709 "block_size": 512, 00:14:43.709 "num_blocks": 65536, 00:14:43.709 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:43.709 "assigned_rate_limits": { 00:14:43.709 "rw_ios_per_sec": 0, 00:14:43.709 "rw_mbytes_per_sec": 0, 00:14:43.709 "r_mbytes_per_sec": 0, 00:14:43.709 
"w_mbytes_per_sec": 0 00:14:43.709 }, 00:14:43.709 "claimed": true, 00:14:43.709 "claim_type": "exclusive_write", 00:14:43.709 "zoned": false, 00:14:43.709 "supported_io_types": { 00:14:43.709 "read": true, 00:14:43.709 "write": true, 00:14:43.709 "unmap": true, 00:14:43.709 "flush": true, 00:14:43.709 "reset": true, 00:14:43.709 "nvme_admin": false, 00:14:43.709 "nvme_io": false, 00:14:43.709 "nvme_io_md": false, 00:14:43.709 "write_zeroes": true, 00:14:43.709 "zcopy": true, 00:14:43.709 "get_zone_info": false, 00:14:43.709 "zone_management": false, 00:14:43.709 "zone_append": false, 00:14:43.709 "compare": false, 00:14:43.709 "compare_and_write": false, 00:14:43.709 "abort": true, 00:14:43.709 "seek_hole": false, 00:14:43.709 "seek_data": false, 00:14:43.709 "copy": true, 00:14:43.709 "nvme_iov_md": false 00:14:43.709 }, 00:14:43.709 "memory_domains": [ 00:14:43.709 { 00:14:43.709 "dma_device_id": "system", 00:14:43.709 "dma_device_type": 1 00:14:43.709 }, 00:14:43.709 { 00:14:43.709 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.709 "dma_device_type": 2 00:14:43.709 } 00:14:43.709 ], 00:14:43.709 "driver_specific": {} 00:14:43.709 } 00:14:43.709 ] 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.709 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.710 10:22:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.273 10:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.273 "name": "Existed_Raid", 00:14:44.273 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:44.273 "strip_size_kb": 64, 00:14:44.273 "state": "configuring", 00:14:44.273 "raid_level": "raid0", 00:14:44.273 "superblock": true, 00:14:44.273 "num_base_bdevs": 3, 00:14:44.273 "num_base_bdevs_discovered": 2, 00:14:44.273 "num_base_bdevs_operational": 3, 00:14:44.273 "base_bdevs_list": [ 00:14:44.273 { 00:14:44.273 "name": "BaseBdev1", 00:14:44.273 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:44.273 "is_configured": true, 00:14:44.273 "data_offset": 2048, 00:14:44.273 "data_size": 63488 00:14:44.273 }, 00:14:44.273 { 00:14:44.273 "name": null, 00:14:44.273 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:44.273 "is_configured": false, 00:14:44.273 "data_offset": 2048, 00:14:44.273 "data_size": 63488 00:14:44.273 }, 00:14:44.273 { 00:14:44.273 "name": "BaseBdev3", 00:14:44.273 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:44.273 "is_configured": true, 00:14:44.273 "data_offset": 2048, 00:14:44.273 "data_size": 63488 00:14:44.273 } 
00:14:44.273 ] 00:14:44.273 }' 00:14:44.273 10:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.273 10:22:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.836 10:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.836 10:22:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:45.094 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:45.094 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:45.352 [2024-07-15 10:22:22.547755] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.610 
10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.610 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.868 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.868 "name": "Existed_Raid", 00:14:45.868 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:45.868 "strip_size_kb": 64, 00:14:45.868 "state": "configuring", 00:14:45.868 "raid_level": "raid0", 00:14:45.868 "superblock": true, 00:14:45.868 "num_base_bdevs": 3, 00:14:45.868 "num_base_bdevs_discovered": 1, 00:14:45.868 "num_base_bdevs_operational": 3, 00:14:45.868 "base_bdevs_list": [ 00:14:45.868 { 00:14:45.868 "name": "BaseBdev1", 00:14:45.868 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:45.868 "is_configured": true, 00:14:45.868 "data_offset": 2048, 00:14:45.868 "data_size": 63488 00:14:45.868 }, 00:14:45.868 { 00:14:45.868 "name": null, 00:14:45.868 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:45.868 "is_configured": false, 00:14:45.868 "data_offset": 2048, 00:14:45.868 "data_size": 63488 00:14:45.868 }, 00:14:45.868 { 00:14:45.868 "name": null, 00:14:45.868 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:45.868 "is_configured": false, 00:14:45.868 "data_offset": 2048, 00:14:45.868 "data_size": 63488 00:14:45.868 } 00:14:45.868 ] 00:14:45.868 }' 00:14:45.868 10:22:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.868 10:22:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:46.435 10:22:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.435 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:46.693 [2024-07-15 10:22:23.871325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.693 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.951 10:22:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.951 10:22:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.951 10:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.951 "name": "Existed_Raid", 00:14:46.951 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:46.951 "strip_size_kb": 64, 00:14:46.951 "state": "configuring", 00:14:46.951 "raid_level": "raid0", 00:14:46.951 "superblock": true, 00:14:46.951 "num_base_bdevs": 3, 00:14:46.951 "num_base_bdevs_discovered": 2, 00:14:46.951 "num_base_bdevs_operational": 3, 00:14:46.951 "base_bdevs_list": [ 00:14:46.951 { 00:14:46.951 "name": "BaseBdev1", 00:14:46.951 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:46.951 "is_configured": true, 00:14:46.951 "data_offset": 2048, 00:14:46.951 "data_size": 63488 00:14:46.951 }, 00:14:46.951 { 00:14:46.951 "name": null, 00:14:46.951 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:46.951 "is_configured": false, 00:14:46.951 "data_offset": 2048, 00:14:46.951 "data_size": 63488 00:14:46.951 }, 00:14:46.951 { 00:14:46.951 "name": "BaseBdev3", 00:14:46.951 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:46.951 "is_configured": true, 00:14:46.951 "data_offset": 2048, 00:14:46.951 "data_size": 63488 00:14:46.951 } 00:14:46.951 ] 00:14:46.951 }' 00:14:46.951 10:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.951 10:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.518 10:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:47.518 10:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.776 10:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:47.776 10:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:48.034 [2024-07-15 10:22:25.150731] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.034 10:22:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.293 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:48.293 "name": "Existed_Raid", 00:14:48.293 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:48.293 "strip_size_kb": 64, 00:14:48.293 "state": "configuring", 00:14:48.293 "raid_level": "raid0", 00:14:48.293 "superblock": true, 00:14:48.293 "num_base_bdevs": 3, 00:14:48.293 "num_base_bdevs_discovered": 1, 00:14:48.293 "num_base_bdevs_operational": 3, 00:14:48.293 "base_bdevs_list": [ 00:14:48.293 { 00:14:48.293 "name": null, 00:14:48.293 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:48.293 "is_configured": false, 00:14:48.293 "data_offset": 2048, 00:14:48.293 "data_size": 63488 00:14:48.293 }, 00:14:48.293 { 00:14:48.293 "name": null, 00:14:48.293 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:48.293 "is_configured": false, 00:14:48.293 "data_offset": 2048, 00:14:48.293 "data_size": 63488 00:14:48.293 }, 00:14:48.293 { 00:14:48.293 "name": "BaseBdev3", 00:14:48.293 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:48.293 "is_configured": true, 00:14:48.293 "data_offset": 2048, 00:14:48.293 "data_size": 63488 00:14:48.293 } 00:14:48.293 ] 00:14:48.293 }' 00:14:48.293 10:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:48.293 10:22:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:48.860 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:48.860 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.164 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:49.164 10:22:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:49.423 [2024-07-15 10:22:26.506769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.423 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.682 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.682 "name": 
"Existed_Raid", 00:14:49.682 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:49.682 "strip_size_kb": 64, 00:14:49.682 "state": "configuring", 00:14:49.682 "raid_level": "raid0", 00:14:49.682 "superblock": true, 00:14:49.682 "num_base_bdevs": 3, 00:14:49.682 "num_base_bdevs_discovered": 2, 00:14:49.682 "num_base_bdevs_operational": 3, 00:14:49.682 "base_bdevs_list": [ 00:14:49.682 { 00:14:49.682 "name": null, 00:14:49.682 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:49.682 "is_configured": false, 00:14:49.682 "data_offset": 2048, 00:14:49.682 "data_size": 63488 00:14:49.682 }, 00:14:49.682 { 00:14:49.682 "name": "BaseBdev2", 00:14:49.682 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:49.682 "is_configured": true, 00:14:49.682 "data_offset": 2048, 00:14:49.682 "data_size": 63488 00:14:49.682 }, 00:14:49.682 { 00:14:49.682 "name": "BaseBdev3", 00:14:49.682 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:49.682 "is_configured": true, 00:14:49.682 "data_offset": 2048, 00:14:49.682 "data_size": 63488 00:14:49.682 } 00:14:49.682 ] 00:14:49.682 }' 00:14:49.682 10:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.682 10:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:50.250 10:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.250 10:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:50.509 10:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:50.509 10:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.509 10:22:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:50.767 10:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u be01a01c-a8f9-43b7-987f-4b6851a933b7 00:14:51.025 [2024-07-15 10:22:28.102440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:51.025 [2024-07-15 10:22:28.102587] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11aee90 00:14:51.025 [2024-07-15 10:22:28.102601] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:51.025 [2024-07-15 10:22:28.102776] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xeb5940 00:14:51.025 [2024-07-15 10:22:28.102890] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11aee90 00:14:51.025 [2024-07-15 10:22:28.102901] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11aee90 00:14:51.025 [2024-07-15 10:22:28.103003] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:51.025 NewBaseBdev 00:14:51.025 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:51.025 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:51.025 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:51.025 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:51.025 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:51.025 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:51.025 10:22:28 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:51.283 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:51.542 [ 00:14:51.542 { 00:14:51.542 "name": "NewBaseBdev", 00:14:51.542 "aliases": [ 00:14:51.542 "be01a01c-a8f9-43b7-987f-4b6851a933b7" 00:14:51.542 ], 00:14:51.542 "product_name": "Malloc disk", 00:14:51.542 "block_size": 512, 00:14:51.542 "num_blocks": 65536, 00:14:51.542 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:51.542 "assigned_rate_limits": { 00:14:51.542 "rw_ios_per_sec": 0, 00:14:51.542 "rw_mbytes_per_sec": 0, 00:14:51.542 "r_mbytes_per_sec": 0, 00:14:51.542 "w_mbytes_per_sec": 0 00:14:51.542 }, 00:14:51.542 "claimed": true, 00:14:51.542 "claim_type": "exclusive_write", 00:14:51.542 "zoned": false, 00:14:51.542 "supported_io_types": { 00:14:51.542 "read": true, 00:14:51.542 "write": true, 00:14:51.542 "unmap": true, 00:14:51.542 "flush": true, 00:14:51.542 "reset": true, 00:14:51.542 "nvme_admin": false, 00:14:51.542 "nvme_io": false, 00:14:51.542 "nvme_io_md": false, 00:14:51.542 "write_zeroes": true, 00:14:51.542 "zcopy": true, 00:14:51.542 "get_zone_info": false, 00:14:51.542 "zone_management": false, 00:14:51.542 "zone_append": false, 00:14:51.542 "compare": false, 00:14:51.542 "compare_and_write": false, 00:14:51.542 "abort": true, 00:14:51.542 "seek_hole": false, 00:14:51.542 "seek_data": false, 00:14:51.542 "copy": true, 00:14:51.542 "nvme_iov_md": false 00:14:51.542 }, 00:14:51.542 "memory_domains": [ 00:14:51.542 { 00:14:51.542 "dma_device_id": "system", 00:14:51.542 "dma_device_type": 1 00:14:51.542 }, 00:14:51.542 { 00:14:51.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.542 "dma_device_type": 2 00:14:51.542 } 
00:14:51.542 ], 00:14:51.542 "driver_specific": {} 00:14:51.542 } 00:14:51.542 ] 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.542 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.801 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.801 "name": "Existed_Raid", 00:14:51.801 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:51.801 "strip_size_kb": 64, 00:14:51.801 "state": "online", 00:14:51.801 
"raid_level": "raid0", 00:14:51.801 "superblock": true, 00:14:51.801 "num_base_bdevs": 3, 00:14:51.801 "num_base_bdevs_discovered": 3, 00:14:51.801 "num_base_bdevs_operational": 3, 00:14:51.801 "base_bdevs_list": [ 00:14:51.801 { 00:14:51.801 "name": "NewBaseBdev", 00:14:51.801 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:51.801 "is_configured": true, 00:14:51.801 "data_offset": 2048, 00:14:51.801 "data_size": 63488 00:14:51.801 }, 00:14:51.801 { 00:14:51.801 "name": "BaseBdev2", 00:14:51.801 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:51.801 "is_configured": true, 00:14:51.801 "data_offset": 2048, 00:14:51.801 "data_size": 63488 00:14:51.801 }, 00:14:51.801 { 00:14:51.801 "name": "BaseBdev3", 00:14:51.801 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:51.801 "is_configured": true, 00:14:51.801 "data_offset": 2048, 00:14:51.801 "data_size": 63488 00:14:51.801 } 00:14:51.801 ] 00:14:51.801 }' 00:14:51.801 10:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.801 10:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:52.367 10:22:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:52.626 [2024-07-15 10:22:29.610748] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:52.626 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:52.627 "name": "Existed_Raid", 00:14:52.627 "aliases": [ 00:14:52.627 "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93" 00:14:52.627 ], 00:14:52.627 "product_name": "Raid Volume", 00:14:52.627 "block_size": 512, 00:14:52.627 "num_blocks": 190464, 00:14:52.627 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:52.627 "assigned_rate_limits": { 00:14:52.627 "rw_ios_per_sec": 0, 00:14:52.627 "rw_mbytes_per_sec": 0, 00:14:52.627 "r_mbytes_per_sec": 0, 00:14:52.627 "w_mbytes_per_sec": 0 00:14:52.627 }, 00:14:52.627 "claimed": false, 00:14:52.627 "zoned": false, 00:14:52.627 "supported_io_types": { 00:14:52.627 "read": true, 00:14:52.627 "write": true, 00:14:52.627 "unmap": true, 00:14:52.627 "flush": true, 00:14:52.627 "reset": true, 00:14:52.627 "nvme_admin": false, 00:14:52.627 "nvme_io": false, 00:14:52.627 "nvme_io_md": false, 00:14:52.627 "write_zeroes": true, 00:14:52.627 "zcopy": false, 00:14:52.627 "get_zone_info": false, 00:14:52.627 "zone_management": false, 00:14:52.627 "zone_append": false, 00:14:52.627 "compare": false, 00:14:52.627 "compare_and_write": false, 00:14:52.627 "abort": false, 00:14:52.627 "seek_hole": false, 00:14:52.627 "seek_data": false, 00:14:52.627 "copy": false, 00:14:52.627 "nvme_iov_md": false 00:14:52.627 }, 00:14:52.627 "memory_domains": [ 00:14:52.627 { 00:14:52.627 "dma_device_id": "system", 00:14:52.627 "dma_device_type": 1 00:14:52.627 }, 00:14:52.627 { 00:14:52.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.627 "dma_device_type": 2 00:14:52.627 }, 00:14:52.627 { 00:14:52.627 "dma_device_id": "system", 00:14:52.627 "dma_device_type": 1 00:14:52.627 }, 
00:14:52.627 { 00:14:52.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.627 "dma_device_type": 2 00:14:52.627 }, 00:14:52.627 { 00:14:52.627 "dma_device_id": "system", 00:14:52.627 "dma_device_type": 1 00:14:52.627 }, 00:14:52.627 { 00:14:52.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.627 "dma_device_type": 2 00:14:52.627 } 00:14:52.627 ], 00:14:52.627 "driver_specific": { 00:14:52.627 "raid": { 00:14:52.627 "uuid": "dee54d29-f4b9-4f1c-86ca-6f1cc8fa5c93", 00:14:52.627 "strip_size_kb": 64, 00:14:52.627 "state": "online", 00:14:52.627 "raid_level": "raid0", 00:14:52.627 "superblock": true, 00:14:52.627 "num_base_bdevs": 3, 00:14:52.627 "num_base_bdevs_discovered": 3, 00:14:52.627 "num_base_bdevs_operational": 3, 00:14:52.627 "base_bdevs_list": [ 00:14:52.627 { 00:14:52.627 "name": "NewBaseBdev", 00:14:52.627 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:52.627 "is_configured": true, 00:14:52.627 "data_offset": 2048, 00:14:52.627 "data_size": 63488 00:14:52.627 }, 00:14:52.627 { 00:14:52.627 "name": "BaseBdev2", 00:14:52.627 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:52.627 "is_configured": true, 00:14:52.627 "data_offset": 2048, 00:14:52.627 "data_size": 63488 00:14:52.627 }, 00:14:52.627 { 00:14:52.627 "name": "BaseBdev3", 00:14:52.627 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:52.627 "is_configured": true, 00:14:52.627 "data_offset": 2048, 00:14:52.627 "data_size": 63488 00:14:52.627 } 00:14:52.627 ] 00:14:52.627 } 00:14:52.627 } 00:14:52.627 }' 00:14:52.627 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:52.627 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:52.627 BaseBdev2 00:14:52.627 BaseBdev3' 00:14:52.627 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:52.627 
10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:52.627 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:52.886 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:52.886 "name": "NewBaseBdev", 00:14:52.886 "aliases": [ 00:14:52.886 "be01a01c-a8f9-43b7-987f-4b6851a933b7" 00:14:52.886 ], 00:14:52.886 "product_name": "Malloc disk", 00:14:52.886 "block_size": 512, 00:14:52.886 "num_blocks": 65536, 00:14:52.886 "uuid": "be01a01c-a8f9-43b7-987f-4b6851a933b7", 00:14:52.886 "assigned_rate_limits": { 00:14:52.886 "rw_ios_per_sec": 0, 00:14:52.886 "rw_mbytes_per_sec": 0, 00:14:52.886 "r_mbytes_per_sec": 0, 00:14:52.886 "w_mbytes_per_sec": 0 00:14:52.886 }, 00:14:52.886 "claimed": true, 00:14:52.886 "claim_type": "exclusive_write", 00:14:52.886 "zoned": false, 00:14:52.886 "supported_io_types": { 00:14:52.886 "read": true, 00:14:52.886 "write": true, 00:14:52.886 "unmap": true, 00:14:52.886 "flush": true, 00:14:52.886 "reset": true, 00:14:52.886 "nvme_admin": false, 00:14:52.886 "nvme_io": false, 00:14:52.886 "nvme_io_md": false, 00:14:52.886 "write_zeroes": true, 00:14:52.886 "zcopy": true, 00:14:52.886 "get_zone_info": false, 00:14:52.886 "zone_management": false, 00:14:52.886 "zone_append": false, 00:14:52.886 "compare": false, 00:14:52.886 "compare_and_write": false, 00:14:52.886 "abort": true, 00:14:52.886 "seek_hole": false, 00:14:52.886 "seek_data": false, 00:14:52.886 "copy": true, 00:14:52.886 "nvme_iov_md": false 00:14:52.886 }, 00:14:52.886 "memory_domains": [ 00:14:52.886 { 00:14:52.886 "dma_device_id": "system", 00:14:52.886 "dma_device_type": 1 00:14:52.886 }, 00:14:52.886 { 00:14:52.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.886 "dma_device_type": 2 00:14:52.886 } 00:14:52.886 ], 00:14:52.886 
"driver_specific": {} 00:14:52.886 }' 00:14:52.886 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.886 10:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.886 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:52.886 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.886 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:53.145 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:53.404 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:53.404 "name": "BaseBdev2", 00:14:53.404 "aliases": [ 00:14:53.404 "f61d7844-0569-41b7-b499-66cd4616b018" 00:14:53.404 ], 00:14:53.404 "product_name": 
"Malloc disk", 00:14:53.404 "block_size": 512, 00:14:53.404 "num_blocks": 65536, 00:14:53.404 "uuid": "f61d7844-0569-41b7-b499-66cd4616b018", 00:14:53.404 "assigned_rate_limits": { 00:14:53.404 "rw_ios_per_sec": 0, 00:14:53.404 "rw_mbytes_per_sec": 0, 00:14:53.404 "r_mbytes_per_sec": 0, 00:14:53.404 "w_mbytes_per_sec": 0 00:14:53.404 }, 00:14:53.404 "claimed": true, 00:14:53.404 "claim_type": "exclusive_write", 00:14:53.404 "zoned": false, 00:14:53.404 "supported_io_types": { 00:14:53.404 "read": true, 00:14:53.404 "write": true, 00:14:53.404 "unmap": true, 00:14:53.404 "flush": true, 00:14:53.404 "reset": true, 00:14:53.404 "nvme_admin": false, 00:14:53.404 "nvme_io": false, 00:14:53.404 "nvme_io_md": false, 00:14:53.404 "write_zeroes": true, 00:14:53.404 "zcopy": true, 00:14:53.404 "get_zone_info": false, 00:14:53.404 "zone_management": false, 00:14:53.404 "zone_append": false, 00:14:53.404 "compare": false, 00:14:53.404 "compare_and_write": false, 00:14:53.404 "abort": true, 00:14:53.404 "seek_hole": false, 00:14:53.404 "seek_data": false, 00:14:53.404 "copy": true, 00:14:53.404 "nvme_iov_md": false 00:14:53.404 }, 00:14:53.404 "memory_domains": [ 00:14:53.404 { 00:14:53.404 "dma_device_id": "system", 00:14:53.404 "dma_device_type": 1 00:14:53.404 }, 00:14:53.404 { 00:14:53.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.404 "dma_device_type": 2 00:14:53.404 } 00:14:53.404 ], 00:14:53.404 "driver_specific": {} 00:14:53.404 }' 00:14:53.404 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.404 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.404 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:53.404 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.662 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.662 
10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:53.662 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.662 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.662 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.662 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.662 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.921 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.921 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:53.921 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:53.921 10:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:53.921 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:53.921 "name": "BaseBdev3", 00:14:53.921 "aliases": [ 00:14:53.921 "2ca94c0b-4963-4482-9a56-8bda7cbf2290" 00:14:53.921 ], 00:14:53.921 "product_name": "Malloc disk", 00:14:53.921 "block_size": 512, 00:14:53.921 "num_blocks": 65536, 00:14:53.921 "uuid": "2ca94c0b-4963-4482-9a56-8bda7cbf2290", 00:14:53.921 "assigned_rate_limits": { 00:14:53.921 "rw_ios_per_sec": 0, 00:14:53.921 "rw_mbytes_per_sec": 0, 00:14:53.921 "r_mbytes_per_sec": 0, 00:14:53.921 "w_mbytes_per_sec": 0 00:14:53.921 }, 00:14:53.921 "claimed": true, 00:14:53.921 "claim_type": "exclusive_write", 00:14:53.921 "zoned": false, 00:14:53.921 "supported_io_types": { 00:14:53.921 "read": true, 00:14:53.921 "write": true, 00:14:53.921 "unmap": true, 
00:14:53.921 "flush": true, 00:14:53.921 "reset": true, 00:14:53.921 "nvme_admin": false, 00:14:53.921 "nvme_io": false, 00:14:53.921 "nvme_io_md": false, 00:14:53.921 "write_zeroes": true, 00:14:53.921 "zcopy": true, 00:14:53.921 "get_zone_info": false, 00:14:53.921 "zone_management": false, 00:14:53.921 "zone_append": false, 00:14:53.921 "compare": false, 00:14:53.921 "compare_and_write": false, 00:14:53.921 "abort": true, 00:14:53.921 "seek_hole": false, 00:14:53.921 "seek_data": false, 00:14:53.921 "copy": true, 00:14:53.921 "nvme_iov_md": false 00:14:53.921 }, 00:14:53.921 "memory_domains": [ 00:14:53.921 { 00:14:53.921 "dma_device_id": "system", 00:14:53.921 "dma_device_type": 1 00:14:53.921 }, 00:14:53.921 { 00:14:53.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.921 "dma_device_type": 2 00:14:53.921 } 00:14:53.921 ], 00:14:53.921 "driver_specific": {} 00:14:53.921 }' 00:14:53.921 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:54.180 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:54.180 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:54.180 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:54.180 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:54.180 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:54.180 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:54.180 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:54.439 10:22:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:54.439 [2024-07-15 10:22:31.583680] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:54.439 [2024-07-15 10:22:31.583709] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:54.439 [2024-07-15 10:22:31.583767] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:54.439 [2024-07-15 10:22:31.583822] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:54.439 [2024-07-15 10:22:31.583834] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11aee90 name Existed_Raid, state offline 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 499259 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 499259 ']' 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 499259 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 499259 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 499259' 00:14:54.439 killing process with pid 499259 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 499259 00:14:54.439 [2024-07-15 10:22:31.627427] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:54.439 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 499259 00:14:54.699 [2024-07-15 10:22:31.658506] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:54.699 10:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:54.699 00:14:54.699 real 0m28.591s 00:14:54.699 user 0m52.650s 00:14:54.699 sys 0m4.953s 00:14:54.699 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:54.699 10:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:54.699 ************************************ 00:14:54.699 END TEST raid_state_function_test_sb 00:14:54.699 ************************************ 00:14:54.958 10:22:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:54.958 10:22:31 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:54.958 10:22:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:54.958 10:22:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:54.958 10:22:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:54.958 ************************************ 00:14:54.958 START TEST raid_superblock_test 00:14:54.958 ************************************ 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local 
raid_level=raid0 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=503582 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 503582 /var/tmp/spdk-raid.sock 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 503582 ']' 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:54.958 10:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:54.958 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:54.959 10:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:54.959 10:22:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.959 [2024-07-15 10:22:32.035181] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:14:54.959 [2024-07-15 10:22:32.035251] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid503582 ] 00:14:55.217 [2024-07-15 10:22:32.165596] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:55.217 [2024-07-15 10:22:32.262571] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:55.217 [2024-07-15 10:22:32.320745] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:55.217 [2024-07-15 10:22:32.320778] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:55.784 10:22:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:56.043 malloc1 00:14:56.043 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:56.043 [2024-07-15 10:22:33.227579] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:56.043 [2024-07-15 10:22:33.227630] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:56.043 [2024-07-15 10:22:33.227650] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2697570 00:14:56.043 [2024-07-15 10:22:33.227662] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:56.043 [2024-07-15 10:22:33.229230] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:56.043 [2024-07-15 10:22:33.229259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:56.043 pt1 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:56.301 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:56.301 10:22:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:56.301 malloc2 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:56.559 [2024-07-15 10:22:33.733722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:56.559 [2024-07-15 10:22:33.733772] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:56.559 [2024-07-15 10:22:33.733790] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2698970 00:14:56.559 [2024-07-15 10:22:33.733803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:56.559 [2024-07-15 10:22:33.735295] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:56.559 [2024-07-15 10:22:33.735323] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:56.559 pt2 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:56.559 10:22:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:56.559 10:22:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:56.818 malloc3 00:14:56.818 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:57.076 [2024-07-15 10:22:34.235740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:57.076 [2024-07-15 10:22:34.235794] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.076 [2024-07-15 10:22:34.235813] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x282f340 00:14:57.076 [2024-07-15 10:22:34.235825] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.076 [2024-07-15 10:22:34.237342] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.076 [2024-07-15 10:22:34.237371] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:57.076 pt3 00:14:57.076 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:57.076 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:57.076 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:57.335 [2024-07-15 10:22:34.484423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:57.335 [2024-07-15 10:22:34.485720] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:14:57.335 [2024-07-15 10:22:34.485776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:57.335 [2024-07-15 10:22:34.485924] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x268fea0 00:14:57.335 [2024-07-15 10:22:34.485943] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:57.335 [2024-07-15 10:22:34.486142] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2697240 00:14:57.335 [2024-07-15 10:22:34.486286] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x268fea0 00:14:57.335 [2024-07-15 10:22:34.486296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x268fea0 00:14:57.335 [2024-07-15 10:22:34.486398] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.335 
10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.335 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:57.594 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.594 "name": "raid_bdev1", 00:14:57.594 "uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:14:57.594 "strip_size_kb": 64, 00:14:57.594 "state": "online", 00:14:57.594 "raid_level": "raid0", 00:14:57.594 "superblock": true, 00:14:57.594 "num_base_bdevs": 3, 00:14:57.594 "num_base_bdevs_discovered": 3, 00:14:57.594 "num_base_bdevs_operational": 3, 00:14:57.594 "base_bdevs_list": [ 00:14:57.594 { 00:14:57.594 "name": "pt1", 00:14:57.594 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:57.594 "is_configured": true, 00:14:57.594 "data_offset": 2048, 00:14:57.594 "data_size": 63488 00:14:57.594 }, 00:14:57.594 { 00:14:57.594 "name": "pt2", 00:14:57.594 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:57.594 "is_configured": true, 00:14:57.594 "data_offset": 2048, 00:14:57.594 "data_size": 63488 00:14:57.594 }, 00:14:57.594 { 00:14:57.594 "name": "pt3", 00:14:57.594 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:57.594 "is_configured": true, 00:14:57.594 "data_offset": 2048, 00:14:57.594 "data_size": 63488 00:14:57.594 } 00:14:57.594 ] 00:14:57.594 }' 00:14:57.594 10:22:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.594 10:22:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:58.161 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:58.421 [2024-07-15 10:22:35.443228] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:58.421 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:58.421 "name": "raid_bdev1", 00:14:58.421 "aliases": [ 00:14:58.421 "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4" 00:14:58.421 ], 00:14:58.421 "product_name": "Raid Volume", 00:14:58.421 "block_size": 512, 00:14:58.421 "num_blocks": 190464, 00:14:58.421 "uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:14:58.421 "assigned_rate_limits": { 00:14:58.421 "rw_ios_per_sec": 0, 00:14:58.421 "rw_mbytes_per_sec": 0, 00:14:58.421 "r_mbytes_per_sec": 0, 00:14:58.421 "w_mbytes_per_sec": 0 00:14:58.421 }, 00:14:58.421 "claimed": false, 00:14:58.421 "zoned": false, 00:14:58.421 "supported_io_types": { 00:14:58.421 "read": true, 00:14:58.421 "write": true, 00:14:58.421 "unmap": true, 00:14:58.421 "flush": true, 00:14:58.421 "reset": true, 00:14:58.421 "nvme_admin": false, 00:14:58.421 "nvme_io": false, 00:14:58.421 "nvme_io_md": false, 00:14:58.421 "write_zeroes": true, 00:14:58.421 "zcopy": false, 00:14:58.421 "get_zone_info": false, 00:14:58.421 "zone_management": false, 00:14:58.421 "zone_append": false, 00:14:58.421 "compare": false, 00:14:58.421 "compare_and_write": false, 00:14:58.421 "abort": false, 00:14:58.421 
"seek_hole": false, 00:14:58.421 "seek_data": false, 00:14:58.421 "copy": false, 00:14:58.421 "nvme_iov_md": false 00:14:58.421 }, 00:14:58.421 "memory_domains": [ 00:14:58.421 { 00:14:58.421 "dma_device_id": "system", 00:14:58.421 "dma_device_type": 1 00:14:58.421 }, 00:14:58.421 { 00:14:58.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.421 "dma_device_type": 2 00:14:58.421 }, 00:14:58.421 { 00:14:58.421 "dma_device_id": "system", 00:14:58.421 "dma_device_type": 1 00:14:58.421 }, 00:14:58.421 { 00:14:58.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.421 "dma_device_type": 2 00:14:58.421 }, 00:14:58.421 { 00:14:58.421 "dma_device_id": "system", 00:14:58.421 "dma_device_type": 1 00:14:58.421 }, 00:14:58.421 { 00:14:58.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.421 "dma_device_type": 2 00:14:58.421 } 00:14:58.421 ], 00:14:58.421 "driver_specific": { 00:14:58.421 "raid": { 00:14:58.421 "uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:14:58.421 "strip_size_kb": 64, 00:14:58.421 "state": "online", 00:14:58.421 "raid_level": "raid0", 00:14:58.421 "superblock": true, 00:14:58.421 "num_base_bdevs": 3, 00:14:58.421 "num_base_bdevs_discovered": 3, 00:14:58.421 "num_base_bdevs_operational": 3, 00:14:58.421 "base_bdevs_list": [ 00:14:58.421 { 00:14:58.421 "name": "pt1", 00:14:58.421 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:58.421 "is_configured": true, 00:14:58.421 "data_offset": 2048, 00:14:58.421 "data_size": 63488 00:14:58.421 }, 00:14:58.421 { 00:14:58.421 "name": "pt2", 00:14:58.421 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:58.421 "is_configured": true, 00:14:58.421 "data_offset": 2048, 00:14:58.421 "data_size": 63488 00:14:58.421 }, 00:14:58.421 { 00:14:58.421 "name": "pt3", 00:14:58.421 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:58.421 "is_configured": true, 00:14:58.421 "data_offset": 2048, 00:14:58.421 "data_size": 63488 00:14:58.421 } 00:14:58.421 ] 00:14:58.421 } 00:14:58.421 } 00:14:58.421 }' 
00:14:58.421 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:58.421 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:58.421 pt2 00:14:58.421 pt3' 00:14:58.421 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.421 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:58.421 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.680 "name": "pt1", 00:14:58.680 "aliases": [ 00:14:58.680 "00000000-0000-0000-0000-000000000001" 00:14:58.680 ], 00:14:58.680 "product_name": "passthru", 00:14:58.680 "block_size": 512, 00:14:58.680 "num_blocks": 65536, 00:14:58.680 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:58.680 "assigned_rate_limits": { 00:14:58.680 "rw_ios_per_sec": 0, 00:14:58.680 "rw_mbytes_per_sec": 0, 00:14:58.680 "r_mbytes_per_sec": 0, 00:14:58.680 "w_mbytes_per_sec": 0 00:14:58.680 }, 00:14:58.680 "claimed": true, 00:14:58.680 "claim_type": "exclusive_write", 00:14:58.680 "zoned": false, 00:14:58.680 "supported_io_types": { 00:14:58.680 "read": true, 00:14:58.680 "write": true, 00:14:58.680 "unmap": true, 00:14:58.680 "flush": true, 00:14:58.680 "reset": true, 00:14:58.680 "nvme_admin": false, 00:14:58.680 "nvme_io": false, 00:14:58.680 "nvme_io_md": false, 00:14:58.680 "write_zeroes": true, 00:14:58.680 "zcopy": true, 00:14:58.680 "get_zone_info": false, 00:14:58.680 "zone_management": false, 00:14:58.680 "zone_append": false, 00:14:58.680 "compare": false, 00:14:58.680 "compare_and_write": false, 00:14:58.680 "abort": true, 00:14:58.680 "seek_hole": false, 00:14:58.680 
"seek_data": false, 00:14:58.680 "copy": true, 00:14:58.680 "nvme_iov_md": false 00:14:58.680 }, 00:14:58.680 "memory_domains": [ 00:14:58.680 { 00:14:58.680 "dma_device_id": "system", 00:14:58.680 "dma_device_type": 1 00:14:58.680 }, 00:14:58.680 { 00:14:58.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.680 "dma_device_type": 2 00:14:58.680 } 00:14:58.680 ], 00:14:58.680 "driver_specific": { 00:14:58.680 "passthru": { 00:14:58.680 "name": "pt1", 00:14:58.680 "base_bdev_name": "malloc1" 00:14:58.680 } 00:14:58.680 } 00:14:58.680 }' 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.680 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.939 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.939 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.939 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.939 10:22:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.939 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.939 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.939 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:58.939 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:59.198 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:59.198 "name": "pt2", 00:14:59.198 "aliases": [ 00:14:59.198 "00000000-0000-0000-0000-000000000002" 00:14:59.198 ], 00:14:59.198 "product_name": "passthru", 00:14:59.198 "block_size": 512, 00:14:59.198 "num_blocks": 65536, 00:14:59.198 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:59.198 "assigned_rate_limits": { 00:14:59.198 "rw_ios_per_sec": 0, 00:14:59.198 "rw_mbytes_per_sec": 0, 00:14:59.198 "r_mbytes_per_sec": 0, 00:14:59.198 "w_mbytes_per_sec": 0 00:14:59.198 }, 00:14:59.198 "claimed": true, 00:14:59.198 "claim_type": "exclusive_write", 00:14:59.198 "zoned": false, 00:14:59.198 "supported_io_types": { 00:14:59.198 "read": true, 00:14:59.198 "write": true, 00:14:59.198 "unmap": true, 00:14:59.198 "flush": true, 00:14:59.198 "reset": true, 00:14:59.198 "nvme_admin": false, 00:14:59.198 "nvme_io": false, 00:14:59.198 "nvme_io_md": false, 00:14:59.198 "write_zeroes": true, 00:14:59.198 "zcopy": true, 00:14:59.198 "get_zone_info": false, 00:14:59.198 "zone_management": false, 00:14:59.198 "zone_append": false, 00:14:59.198 "compare": false, 00:14:59.198 "compare_and_write": false, 00:14:59.198 "abort": true, 00:14:59.198 "seek_hole": false, 00:14:59.198 "seek_data": false, 00:14:59.198 "copy": true, 00:14:59.198 "nvme_iov_md": false 00:14:59.198 }, 00:14:59.198 "memory_domains": [ 00:14:59.198 { 00:14:59.198 "dma_device_id": "system", 00:14:59.198 "dma_device_type": 1 00:14:59.198 }, 00:14:59.198 { 00:14:59.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.198 "dma_device_type": 2 00:14:59.198 } 00:14:59.198 ], 00:14:59.198 "driver_specific": { 00:14:59.198 "passthru": { 00:14:59.198 "name": "pt2", 00:14:59.198 "base_bdev_name": "malloc2" 00:14:59.198 } 00:14:59.198 } 00:14:59.198 }' 00:14:59.198 10:22:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.198 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.198 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.198 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.198 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:59.457 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:59.715 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:59.716 "name": "pt3", 00:14:59.716 "aliases": [ 00:14:59.716 "00000000-0000-0000-0000-000000000003" 00:14:59.716 ], 00:14:59.716 "product_name": "passthru", 00:14:59.716 "block_size": 512, 00:14:59.716 "num_blocks": 65536, 00:14:59.716 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:59.716 "assigned_rate_limits": { 
00:14:59.716 "rw_ios_per_sec": 0, 00:14:59.716 "rw_mbytes_per_sec": 0, 00:14:59.716 "r_mbytes_per_sec": 0, 00:14:59.716 "w_mbytes_per_sec": 0 00:14:59.716 }, 00:14:59.716 "claimed": true, 00:14:59.716 "claim_type": "exclusive_write", 00:14:59.716 "zoned": false, 00:14:59.716 "supported_io_types": { 00:14:59.716 "read": true, 00:14:59.716 "write": true, 00:14:59.716 "unmap": true, 00:14:59.716 "flush": true, 00:14:59.716 "reset": true, 00:14:59.716 "nvme_admin": false, 00:14:59.716 "nvme_io": false, 00:14:59.716 "nvme_io_md": false, 00:14:59.716 "write_zeroes": true, 00:14:59.716 "zcopy": true, 00:14:59.716 "get_zone_info": false, 00:14:59.716 "zone_management": false, 00:14:59.716 "zone_append": false, 00:14:59.716 "compare": false, 00:14:59.716 "compare_and_write": false, 00:14:59.716 "abort": true, 00:14:59.716 "seek_hole": false, 00:14:59.716 "seek_data": false, 00:14:59.716 "copy": true, 00:14:59.716 "nvme_iov_md": false 00:14:59.716 }, 00:14:59.716 "memory_domains": [ 00:14:59.716 { 00:14:59.716 "dma_device_id": "system", 00:14:59.716 "dma_device_type": 1 00:14:59.716 }, 00:14:59.716 { 00:14:59.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.716 "dma_device_type": 2 00:14:59.716 } 00:14:59.716 ], 00:14:59.716 "driver_specific": { 00:14:59.716 "passthru": { 00:14:59.716 "name": "pt3", 00:14:59.716 "base_bdev_name": "malloc3" 00:14:59.716 } 00:14:59.716 } 00:14:59.716 }' 00:14:59.716 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.716 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.716 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.716 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.974 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.974 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:59.974 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.974 10:22:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.974 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.974 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.974 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.974 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.974 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:59.974 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:00.232 [2024-07-15 10:22:37.356416] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:00.233 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8a743934-d6a6-4b4f-937e-bd6ffc4c89c4 00:15:00.233 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8a743934-d6a6-4b4f-937e-bd6ffc4c89c4 ']' 00:15:00.233 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:00.491 [2024-07-15 10:22:37.524560] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:00.491 [2024-07-15 10:22:37.524586] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:00.491 [2024-07-15 10:22:37.524643] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:00.491 [2024-07-15 10:22:37.524696] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:15:00.491 [2024-07-15 10:22:37.524709] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x268fea0 name raid_bdev1, state offline 00:15:00.491 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.491 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:00.750 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:00.750 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:00.750 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:00.750 10:22:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:01.008 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:01.008 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:01.267 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:01.267 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:01.267 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:01.267 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:01.525 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:01.785 [2024-07-15 10:22:38.795883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:01.785 [2024-07-15 10:22:38.797240] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:01.785 [2024-07-15 10:22:38.797283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:01.785 [2024-07-15 10:22:38.797328] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:01.785 [2024-07-15 10:22:38.797369] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:01.785 [2024-07-15 10:22:38.797392] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:01.785 [2024-07-15 10:22:38.797409] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:01.785 [2024-07-15 10:22:38.797420] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x283aff0 name raid_bdev1, state configuring 00:15:01.785 request: 00:15:01.785 { 00:15:01.785 "name": "raid_bdev1", 00:15:01.785 "raid_level": "raid0", 00:15:01.785 "base_bdevs": [ 00:15:01.785 "malloc1", 00:15:01.785 "malloc2", 00:15:01.785 "malloc3" 00:15:01.785 ], 00:15:01.785 "strip_size_kb": 64, 00:15:01.785 "superblock": false, 00:15:01.785 "method": "bdev_raid_create", 00:15:01.785 "req_id": 1 00:15:01.785 } 00:15:01.785 Got JSON-RPC error response 00:15:01.785 response: 00:15:01.785 { 00:15:01.785 "code": -17, 00:15:01.785 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:01.785 } 00:15:01.785 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:01.785 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:15:01.785 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:01.785 10:22:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:01.785 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.785 10:22:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:02.044 [2024-07-15 10:22:39.212932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:02.044 [2024-07-15 10:22:39.212983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.044 [2024-07-15 10:22:39.213013] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26977a0 00:15:02.044 [2024-07-15 10:22:39.213026] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.044 [2024-07-15 10:22:39.214664] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.044 [2024-07-15 10:22:39.214692] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:02.044 [2024-07-15 10:22:39.214767] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:02.044 [2024-07-15 10:22:39.214793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:02.044 pt1 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.044 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:02.303 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.303 "name": "raid_bdev1", 00:15:02.303 "uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:15:02.303 "strip_size_kb": 64, 00:15:02.303 "state": "configuring", 00:15:02.303 "raid_level": "raid0", 00:15:02.303 "superblock": true, 00:15:02.303 "num_base_bdevs": 3, 00:15:02.303 "num_base_bdevs_discovered": 1, 00:15:02.303 "num_base_bdevs_operational": 3, 00:15:02.303 "base_bdevs_list": [ 00:15:02.303 { 00:15:02.303 "name": "pt1", 00:15:02.303 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:02.303 
"is_configured": true, 00:15:02.303 "data_offset": 2048, 00:15:02.303 "data_size": 63488 00:15:02.303 }, 00:15:02.303 { 00:15:02.303 "name": null, 00:15:02.303 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:02.303 "is_configured": false, 00:15:02.303 "data_offset": 2048, 00:15:02.303 "data_size": 63488 00:15:02.303 }, 00:15:02.303 { 00:15:02.303 "name": null, 00:15:02.303 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:02.303 "is_configured": false, 00:15:02.303 "data_offset": 2048, 00:15:02.303 "data_size": 63488 00:15:02.303 } 00:15:02.303 ] 00:15:02.303 }' 00:15:02.303 10:22:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.303 10:22:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.871 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:02.871 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:03.192 [2024-07-15 10:22:40.259824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:03.192 [2024-07-15 10:22:40.259881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:03.192 [2024-07-15 10:22:40.259902] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268ec70 00:15:03.192 [2024-07-15 10:22:40.259915] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:03.192 [2024-07-15 10:22:40.260283] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:03.192 [2024-07-15 10:22:40.260301] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:03.192 [2024-07-15 10:22:40.260371] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:03.192 [2024-07-15 
10:22:40.260397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:03.192 pt2 00:15:03.192 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:03.450 [2024-07-15 10:22:40.508504] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.450 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.709 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.709 "name": "raid_bdev1", 00:15:03.709 
"uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:15:03.709 "strip_size_kb": 64, 00:15:03.709 "state": "configuring", 00:15:03.709 "raid_level": "raid0", 00:15:03.709 "superblock": true, 00:15:03.709 "num_base_bdevs": 3, 00:15:03.709 "num_base_bdevs_discovered": 1, 00:15:03.709 "num_base_bdevs_operational": 3, 00:15:03.709 "base_bdevs_list": [ 00:15:03.709 { 00:15:03.709 "name": "pt1", 00:15:03.709 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:03.709 "is_configured": true, 00:15:03.709 "data_offset": 2048, 00:15:03.709 "data_size": 63488 00:15:03.709 }, 00:15:03.709 { 00:15:03.709 "name": null, 00:15:03.709 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:03.709 "is_configured": false, 00:15:03.709 "data_offset": 2048, 00:15:03.709 "data_size": 63488 00:15:03.709 }, 00:15:03.709 { 00:15:03.709 "name": null, 00:15:03.709 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:03.709 "is_configured": false, 00:15:03.709 "data_offset": 2048, 00:15:03.709 "data_size": 63488 00:15:03.709 } 00:15:03.709 ] 00:15:03.709 }' 00:15:03.709 10:22:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.709 10:22:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.275 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:04.275 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:04.275 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:04.533 [2024-07-15 10:22:41.535202] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:04.533 [2024-07-15 10:22:41.535258] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:04.533 [2024-07-15 10:22:41.535276] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x282ffa0 00:15:04.533 [2024-07-15 10:22:41.535289] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:04.533 [2024-07-15 10:22:41.535643] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:04.533 [2024-07-15 10:22:41.535660] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:04.533 [2024-07-15 10:22:41.535725] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:04.533 [2024-07-15 10:22:41.535744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:04.533 pt2 00:15:04.533 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:04.533 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:04.533 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:04.791 [2024-07-15 10:22:41.779848] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:04.791 [2024-07-15 10:22:41.779890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:04.791 [2024-07-15 10:22:41.779909] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2830b30 00:15:04.791 [2024-07-15 10:22:41.779922] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:04.791 [2024-07-15 10:22:41.780236] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:04.791 [2024-07-15 10:22:41.780253] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:04.791 [2024-07-15 10:22:41.780310] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:04.791 
[2024-07-15 10:22:41.780329] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:04.791 [2024-07-15 10:22:41.780433] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2831c00 00:15:04.791 [2024-07-15 10:22:41.780444] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:04.791 [2024-07-15 10:22:41.780609] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x283a9b0 00:15:04.791 [2024-07-15 10:22:41.780730] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2831c00 00:15:04.791 [2024-07-15 10:22:41.780740] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2831c00 00:15:04.791 [2024-07-15 10:22:41.780836] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.791 pt3 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.791 10:22:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.791 10:22:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:05.049 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.049 "name": "raid_bdev1", 00:15:05.049 "uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:15:05.049 "strip_size_kb": 64, 00:15:05.049 "state": "online", 00:15:05.049 "raid_level": "raid0", 00:15:05.049 "superblock": true, 00:15:05.049 "num_base_bdevs": 3, 00:15:05.049 "num_base_bdevs_discovered": 3, 00:15:05.049 "num_base_bdevs_operational": 3, 00:15:05.049 "base_bdevs_list": [ 00:15:05.049 { 00:15:05.049 "name": "pt1", 00:15:05.049 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:05.049 "is_configured": true, 00:15:05.049 "data_offset": 2048, 00:15:05.049 "data_size": 63488 00:15:05.050 }, 00:15:05.050 { 00:15:05.050 "name": "pt2", 00:15:05.050 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:05.050 "is_configured": true, 00:15:05.050 "data_offset": 2048, 00:15:05.050 "data_size": 63488 00:15:05.050 }, 00:15:05.050 { 00:15:05.050 "name": "pt3", 00:15:05.050 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:05.050 "is_configured": true, 00:15:05.050 "data_offset": 2048, 00:15:05.050 "data_size": 63488 00:15:05.050 } 00:15:05.050 ] 00:15:05.050 }' 00:15:05.050 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.050 10:22:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:05.616 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:05.876 [2024-07-15 10:22:42.850981] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:05.876 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:05.876 "name": "raid_bdev1", 00:15:05.876 "aliases": [ 00:15:05.876 "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4" 00:15:05.876 ], 00:15:05.876 "product_name": "Raid Volume", 00:15:05.876 "block_size": 512, 00:15:05.876 "num_blocks": 190464, 00:15:05.876 "uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:15:05.876 "assigned_rate_limits": { 00:15:05.876 "rw_ios_per_sec": 0, 00:15:05.876 "rw_mbytes_per_sec": 0, 00:15:05.876 "r_mbytes_per_sec": 0, 00:15:05.876 "w_mbytes_per_sec": 0 00:15:05.876 }, 00:15:05.876 "claimed": false, 00:15:05.876 "zoned": false, 00:15:05.876 "supported_io_types": { 00:15:05.876 "read": true, 00:15:05.876 "write": true, 00:15:05.876 "unmap": true, 00:15:05.876 "flush": true, 00:15:05.876 "reset": true, 00:15:05.876 "nvme_admin": false, 00:15:05.876 "nvme_io": false, 00:15:05.876 "nvme_io_md": false, 00:15:05.876 "write_zeroes": true, 00:15:05.876 "zcopy": false, 00:15:05.876 
"get_zone_info": false, 00:15:05.876 "zone_management": false, 00:15:05.876 "zone_append": false, 00:15:05.876 "compare": false, 00:15:05.876 "compare_and_write": false, 00:15:05.876 "abort": false, 00:15:05.876 "seek_hole": false, 00:15:05.876 "seek_data": false, 00:15:05.876 "copy": false, 00:15:05.876 "nvme_iov_md": false 00:15:05.876 }, 00:15:05.876 "memory_domains": [ 00:15:05.876 { 00:15:05.876 "dma_device_id": "system", 00:15:05.876 "dma_device_type": 1 00:15:05.876 }, 00:15:05.876 { 00:15:05.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.876 "dma_device_type": 2 00:15:05.876 }, 00:15:05.876 { 00:15:05.876 "dma_device_id": "system", 00:15:05.876 "dma_device_type": 1 00:15:05.876 }, 00:15:05.876 { 00:15:05.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.876 "dma_device_type": 2 00:15:05.876 }, 00:15:05.876 { 00:15:05.876 "dma_device_id": "system", 00:15:05.876 "dma_device_type": 1 00:15:05.876 }, 00:15:05.876 { 00:15:05.876 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.876 "dma_device_type": 2 00:15:05.876 } 00:15:05.876 ], 00:15:05.876 "driver_specific": { 00:15:05.876 "raid": { 00:15:05.876 "uuid": "8a743934-d6a6-4b4f-937e-bd6ffc4c89c4", 00:15:05.876 "strip_size_kb": 64, 00:15:05.876 "state": "online", 00:15:05.876 "raid_level": "raid0", 00:15:05.876 "superblock": true, 00:15:05.876 "num_base_bdevs": 3, 00:15:05.876 "num_base_bdevs_discovered": 3, 00:15:05.876 "num_base_bdevs_operational": 3, 00:15:05.876 "base_bdevs_list": [ 00:15:05.876 { 00:15:05.876 "name": "pt1", 00:15:05.876 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:05.876 "is_configured": true, 00:15:05.876 "data_offset": 2048, 00:15:05.876 "data_size": 63488 00:15:05.876 }, 00:15:05.876 { 00:15:05.877 "name": "pt2", 00:15:05.877 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:05.877 "is_configured": true, 00:15:05.877 "data_offset": 2048, 00:15:05.877 "data_size": 63488 00:15:05.877 }, 00:15:05.877 { 00:15:05.877 "name": "pt3", 00:15:05.877 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:15:05.877 "is_configured": true, 00:15:05.877 "data_offset": 2048, 00:15:05.877 "data_size": 63488 00:15:05.877 } 00:15:05.877 ] 00:15:05.877 } 00:15:05.877 } 00:15:05.877 }' 00:15:05.877 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:05.877 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:05.877 pt2 00:15:05.877 pt3' 00:15:05.877 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:05.877 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:05.877 10:22:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:06.136 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:06.136 "name": "pt1", 00:15:06.136 "aliases": [ 00:15:06.136 "00000000-0000-0000-0000-000000000001" 00:15:06.136 ], 00:15:06.136 "product_name": "passthru", 00:15:06.136 "block_size": 512, 00:15:06.136 "num_blocks": 65536, 00:15:06.136 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:06.136 "assigned_rate_limits": { 00:15:06.136 "rw_ios_per_sec": 0, 00:15:06.136 "rw_mbytes_per_sec": 0, 00:15:06.136 "r_mbytes_per_sec": 0, 00:15:06.136 "w_mbytes_per_sec": 0 00:15:06.136 }, 00:15:06.136 "claimed": true, 00:15:06.136 "claim_type": "exclusive_write", 00:15:06.136 "zoned": false, 00:15:06.136 "supported_io_types": { 00:15:06.136 "read": true, 00:15:06.136 "write": true, 00:15:06.136 "unmap": true, 00:15:06.136 "flush": true, 00:15:06.136 "reset": true, 00:15:06.136 "nvme_admin": false, 00:15:06.136 "nvme_io": false, 00:15:06.136 "nvme_io_md": false, 00:15:06.136 "write_zeroes": true, 00:15:06.136 "zcopy": true, 00:15:06.136 "get_zone_info": false, 
00:15:06.136 "zone_management": false, 00:15:06.136 "zone_append": false, 00:15:06.136 "compare": false, 00:15:06.136 "compare_and_write": false, 00:15:06.136 "abort": true, 00:15:06.136 "seek_hole": false, 00:15:06.136 "seek_data": false, 00:15:06.136 "copy": true, 00:15:06.136 "nvme_iov_md": false 00:15:06.136 }, 00:15:06.136 "memory_domains": [ 00:15:06.136 { 00:15:06.136 "dma_device_id": "system", 00:15:06.136 "dma_device_type": 1 00:15:06.136 }, 00:15:06.136 { 00:15:06.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.136 "dma_device_type": 2 00:15:06.136 } 00:15:06.136 ], 00:15:06.136 "driver_specific": { 00:15:06.136 "passthru": { 00:15:06.136 "name": "pt1", 00:15:06.136 "base_bdev_name": "malloc1" 00:15:06.136 } 00:15:06.136 } 00:15:06.136 }' 00:15:06.136 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.136 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.136 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:06.136 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.136 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.393 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:06.393 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.393 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.394 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:06.394 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.394 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.394 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:06.394 10:22:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:06.394 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:06.394 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:06.652 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:06.652 "name": "pt2", 00:15:06.652 "aliases": [ 00:15:06.652 "00000000-0000-0000-0000-000000000002" 00:15:06.652 ], 00:15:06.652 "product_name": "passthru", 00:15:06.652 "block_size": 512, 00:15:06.652 "num_blocks": 65536, 00:15:06.652 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:06.652 "assigned_rate_limits": { 00:15:06.652 "rw_ios_per_sec": 0, 00:15:06.652 "rw_mbytes_per_sec": 0, 00:15:06.652 "r_mbytes_per_sec": 0, 00:15:06.652 "w_mbytes_per_sec": 0 00:15:06.652 }, 00:15:06.652 "claimed": true, 00:15:06.652 "claim_type": "exclusive_write", 00:15:06.652 "zoned": false, 00:15:06.652 "supported_io_types": { 00:15:06.652 "read": true, 00:15:06.652 "write": true, 00:15:06.652 "unmap": true, 00:15:06.652 "flush": true, 00:15:06.652 "reset": true, 00:15:06.652 "nvme_admin": false, 00:15:06.652 "nvme_io": false, 00:15:06.652 "nvme_io_md": false, 00:15:06.652 "write_zeroes": true, 00:15:06.652 "zcopy": true, 00:15:06.652 "get_zone_info": false, 00:15:06.652 "zone_management": false, 00:15:06.652 "zone_append": false, 00:15:06.652 "compare": false, 00:15:06.652 "compare_and_write": false, 00:15:06.652 "abort": true, 00:15:06.652 "seek_hole": false, 00:15:06.652 "seek_data": false, 00:15:06.652 "copy": true, 00:15:06.652 "nvme_iov_md": false 00:15:06.652 }, 00:15:06.652 "memory_domains": [ 00:15:06.652 { 00:15:06.652 "dma_device_id": "system", 00:15:06.652 "dma_device_type": 1 00:15:06.652 }, 00:15:06.652 { 00:15:06.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.652 
"dma_device_type": 2 00:15:06.652 } 00:15:06.652 ], 00:15:06.652 "driver_specific": { 00:15:06.652 "passthru": { 00:15:06.652 "name": "pt2", 00:15:06.652 "base_bdev_name": "malloc2" 00:15:06.652 } 00:15:06.652 } 00:15:06.652 }' 00:15:06.652 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.652 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.910 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:06.910 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.910 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.910 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:06.910 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.910 10:22:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.910 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:06.910 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.910 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.910 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:06.910 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:06.910 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:06.910 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:07.169 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:07.169 "name": "pt3", 00:15:07.169 "aliases": [ 00:15:07.169 
"00000000-0000-0000-0000-000000000003" 00:15:07.169 ], 00:15:07.169 "product_name": "passthru", 00:15:07.169 "block_size": 512, 00:15:07.169 "num_blocks": 65536, 00:15:07.169 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:07.169 "assigned_rate_limits": { 00:15:07.169 "rw_ios_per_sec": 0, 00:15:07.169 "rw_mbytes_per_sec": 0, 00:15:07.169 "r_mbytes_per_sec": 0, 00:15:07.169 "w_mbytes_per_sec": 0 00:15:07.169 }, 00:15:07.169 "claimed": true, 00:15:07.169 "claim_type": "exclusive_write", 00:15:07.169 "zoned": false, 00:15:07.169 "supported_io_types": { 00:15:07.169 "read": true, 00:15:07.169 "write": true, 00:15:07.169 "unmap": true, 00:15:07.169 "flush": true, 00:15:07.169 "reset": true, 00:15:07.169 "nvme_admin": false, 00:15:07.169 "nvme_io": false, 00:15:07.169 "nvme_io_md": false, 00:15:07.169 "write_zeroes": true, 00:15:07.169 "zcopy": true, 00:15:07.169 "get_zone_info": false, 00:15:07.169 "zone_management": false, 00:15:07.169 "zone_append": false, 00:15:07.169 "compare": false, 00:15:07.169 "compare_and_write": false, 00:15:07.169 "abort": true, 00:15:07.169 "seek_hole": false, 00:15:07.169 "seek_data": false, 00:15:07.169 "copy": true, 00:15:07.169 "nvme_iov_md": false 00:15:07.169 }, 00:15:07.169 "memory_domains": [ 00:15:07.169 { 00:15:07.169 "dma_device_id": "system", 00:15:07.169 "dma_device_type": 1 00:15:07.169 }, 00:15:07.169 { 00:15:07.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:07.169 "dma_device_type": 2 00:15:07.169 } 00:15:07.169 ], 00:15:07.169 "driver_specific": { 00:15:07.169 "passthru": { 00:15:07.169 "name": "pt3", 00:15:07.169 "base_bdev_name": "malloc3" 00:15:07.169 } 00:15:07.169 } 00:15:07.169 }' 00:15:07.169 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:07.427 10:22:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:07.427 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:07.685 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:07.685 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:07.685 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:07.685 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:07.944 [2024-07-15 10:22:44.924507] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8a743934-d6a6-4b4f-937e-bd6ffc4c89c4 '!=' 8a743934-d6a6-4b4f-937e-bd6ffc4c89c4 ']' 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 503582 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 503582 ']' 00:15:07.944 10:22:44 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 503582 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 503582 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 503582' 00:15:07.944 killing process with pid 503582 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 503582 00:15:07.944 [2024-07-15 10:22:44.993850] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:07.944 10:22:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 503582 00:15:07.944 [2024-07-15 10:22:44.993909] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:07.944 [2024-07-15 10:22:44.993968] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:07.944 [2024-07-15 10:22:44.993981] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2831c00 name raid_bdev1, state offline 00:15:07.944 [2024-07-15 10:22:45.025127] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:08.203 10:22:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:08.203 00:15:08.203 real 0m13.275s 00:15:08.203 user 0m23.842s 00:15:08.203 sys 0m2.379s 00:15:08.203 10:22:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:08.203 10:22:45 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@10 -- # set +x 00:15:08.203 ************************************ 00:15:08.203 END TEST raid_superblock_test 00:15:08.203 ************************************ 00:15:08.203 10:22:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:08.203 10:22:45 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:15:08.203 10:22:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:08.203 10:22:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:08.203 10:22:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:08.203 ************************************ 00:15:08.203 START TEST raid_read_error_test 00:15:08.203 ************************************ 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.sAH7hcPypQ 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=505700 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 505700 /var/tmp/spdk-raid.sock 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 505700 ']' 00:15:08.203 10:22:45 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:08.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.203 10:22:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:08.203 [2024-07-15 10:22:45.399622] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:15:08.203 [2024-07-15 10:22:45.399689] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid505700 ] 00:15:08.460 [2024-07-15 10:22:45.527733] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:08.461 [2024-07-15 10:22:45.629794] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:08.718 [2024-07-15 10:22:45.690760] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:08.718 [2024-07-15 10:22:45.690797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:09.651 10:22:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:09.651 10:22:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:09.651 10:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:09.651 10:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:09.651 BaseBdev1_malloc 00:15:09.908 10:22:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:09.908 true 00:15:09.908 10:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:10.166 [2024-07-15 10:22:47.319715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:10.166 [2024-07-15 10:22:47.319764] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:10.166 [2024-07-15 10:22:47.319785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb980d0 00:15:10.166 [2024-07-15 10:22:47.319798] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:10.166 [2024-07-15 10:22:47.321665] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:10.166 [2024-07-15 10:22:47.321696] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:10.166 BaseBdev1 00:15:10.166 10:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:10.166 10:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:10.732 BaseBdev2_malloc 00:15:10.732 10:22:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:10.990 true 00:15:10.990 10:22:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:11.558 [2024-07-15 10:22:48.576930] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:11.558 [2024-07-15 10:22:48.576975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:11.558 [2024-07-15 10:22:48.576997] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb9c910 00:15:11.558 [2024-07-15 10:22:48.577010] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:11.558 [2024-07-15 10:22:48.578588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:11.558 [2024-07-15 10:22:48.578616] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:11.558 BaseBdev2 00:15:11.558 10:22:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:11.558 10:22:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:12.125 BaseBdev3_malloc 00:15:12.125 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:12.384 true 00:15:12.384 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:12.643 [2024-07-15 10:22:49.601088] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:12.643 [2024-07-15 10:22:49.601136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:12.643 [2024-07-15 10:22:49.601156] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb9ebd0 00:15:12.643 [2024-07-15 10:22:49.601168] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:12.643 [2024-07-15 10:22:49.602798] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:12.643 [2024-07-15 10:22:49.602825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:12.643 BaseBdev3 00:15:12.643 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:12.902 [2024-07-15 10:22:49.845767] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:12.902 [2024-07-15 10:22:49.847152] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:12.902 [2024-07-15 10:22:49.847224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:12.902 [2024-07-15 10:22:49.847430] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xba0280 00:15:12.902 [2024-07-15 10:22:49.847442] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:12.902 [2024-07-15 10:22:49.847646] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb9fe20 00:15:12.902 [2024-07-15 10:22:49.847793] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xba0280 00:15:12.902 [2024-07-15 10:22:49.847803] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xba0280 00:15:12.902 [2024-07-15 10:22:49.847911] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.902 
10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.902 10:22:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:13.161 10:22:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.161 "name": "raid_bdev1", 00:15:13.161 "uuid": "ca43b6ee-105e-4403-8b15-bf466a0fda57", 00:15:13.161 "strip_size_kb": 64, 00:15:13.161 "state": "online", 00:15:13.161 "raid_level": "raid0", 00:15:13.161 "superblock": true, 00:15:13.161 "num_base_bdevs": 3, 00:15:13.161 "num_base_bdevs_discovered": 3, 00:15:13.161 "num_base_bdevs_operational": 3, 00:15:13.161 "base_bdevs_list": [ 00:15:13.161 { 00:15:13.161 "name": "BaseBdev1", 00:15:13.161 "uuid": "736339ea-09b3-514d-b98a-5a2e8cc8d910", 00:15:13.161 "is_configured": true, 00:15:13.161 "data_offset": 2048, 00:15:13.161 "data_size": 63488 00:15:13.161 }, 00:15:13.161 { 00:15:13.161 "name": "BaseBdev2", 00:15:13.161 "uuid": "6d54a362-a465-5838-a99c-9e7aacc1f2c6", 00:15:13.161 "is_configured": true, 00:15:13.161 "data_offset": 2048, 00:15:13.161 "data_size": 63488 00:15:13.161 }, 00:15:13.161 { 00:15:13.161 "name": "BaseBdev3", 00:15:13.161 "uuid": "4525590b-cc94-5541-a987-e932d26b12af", 00:15:13.161 "is_configured": true, 00:15:13.161 "data_offset": 2048, 00:15:13.161 "data_size": 63488 00:15:13.161 } 00:15:13.161 ] 00:15:13.161 }' 00:15:13.161 10:22:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.161 10:22:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.729 10:22:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:13.729 10:22:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:13.729 [2024-07-15 10:22:50.804597] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9ee5b0 00:15:14.662 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.921 10:22:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:15.195 10:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:15.195 "name": "raid_bdev1", 00:15:15.195 "uuid": "ca43b6ee-105e-4403-8b15-bf466a0fda57", 00:15:15.195 "strip_size_kb": 64, 00:15:15.195 "state": "online", 00:15:15.195 "raid_level": "raid0", 00:15:15.195 "superblock": true, 00:15:15.195 "num_base_bdevs": 3, 00:15:15.195 "num_base_bdevs_discovered": 3, 00:15:15.195 "num_base_bdevs_operational": 3, 00:15:15.195 "base_bdevs_list": [ 00:15:15.195 { 00:15:15.195 "name": "BaseBdev1", 00:15:15.195 "uuid": "736339ea-09b3-514d-b98a-5a2e8cc8d910", 00:15:15.195 "is_configured": true, 00:15:15.195 "data_offset": 2048, 00:15:15.195 "data_size": 63488 00:15:15.195 }, 00:15:15.195 { 00:15:15.195 "name": "BaseBdev2", 00:15:15.195 "uuid": "6d54a362-a465-5838-a99c-9e7aacc1f2c6", 00:15:15.195 "is_configured": true, 00:15:15.195 "data_offset": 2048, 00:15:15.195 "data_size": 63488 00:15:15.195 }, 00:15:15.195 { 00:15:15.195 "name": "BaseBdev3", 00:15:15.195 "uuid": "4525590b-cc94-5541-a987-e932d26b12af", 00:15:15.195 "is_configured": true, 00:15:15.195 "data_offset": 2048, 00:15:15.195 "data_size": 63488 00:15:15.195 } 00:15:15.195 ] 00:15:15.195 }' 00:15:15.195 10:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:15.195 10:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.761 10:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:16.019 [2024-07-15 10:22:52.973727] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:16.019 [2024-07-15 10:22:52.973763] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:16.019 [2024-07-15 10:22:52.976919] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:16.019 [2024-07-15 10:22:52.976961] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:16.019 [2024-07-15 10:22:52.976995] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:16.019 [2024-07-15 10:22:52.977007] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xba0280 name raid_bdev1, state offline 00:15:16.019 0 00:15:16.019 10:22:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 505700 00:15:16.019 10:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 505700 ']' 00:15:16.019 10:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 505700 00:15:16.019 10:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:16.019 10:22:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:16.019 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 505700 00:15:16.019 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:16.019 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:16.019 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 505700' 00:15:16.019 killing process with pid 505700 00:15:16.019 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 505700 00:15:16.019 [2024-07-15 10:22:53.039982] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:15:16.019 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 505700 00:15:16.019 [2024-07-15 10:22:53.061193] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.sAH7hcPypQ 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:15:16.276 00:15:16.276 real 0m7.976s 00:15:16.276 user 0m12.874s 00:15:16.276 sys 0m1.365s 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:16.276 10:22:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.276 ************************************ 00:15:16.276 END TEST raid_read_error_test 00:15:16.276 ************************************ 00:15:16.276 10:22:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:16.276 10:22:53 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:15:16.276 10:22:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:16.276 10:22:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:16.276 10:22:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:16.276 ************************************ 00:15:16.276 
START TEST raid_write_error_test 00:15:16.276 ************************************ 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.3W2bLHzIOB 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=506815 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 506815 /var/tmp/spdk-raid.sock 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 506815 ']' 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:16.276 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:16.276 10:22:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.276 [2024-07-15 10:22:53.463058] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:15:16.276 [2024-07-15 10:22:53.463126] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid506815 ] 00:15:16.534 [2024-07-15 10:22:53.593833] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:16.534 [2024-07-15 10:22:53.691308] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:16.793 [2024-07-15 10:22:53.754106] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:16.793 [2024-07-15 10:22:53.754145] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:17.360 10:22:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:17.360 10:22:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:17.360 10:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:17.360 10:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:17.652 BaseBdev1_malloc 00:15:17.652 10:22:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:17.652 true 00:15:17.652 10:22:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:17.910 [2024-07-15 10:22:55.057706] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:17.910 [2024-07-15 10:22:55.057752] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:17.910 [2024-07-15 10:22:55.057774] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x115c0d0 00:15:17.910 [2024-07-15 10:22:55.057786] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:17.910 [2024-07-15 10:22:55.059577] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:17.910 [2024-07-15 10:22:55.059610] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:17.910 BaseBdev1 00:15:17.910 10:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:17.910 10:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:18.168 BaseBdev2_malloc 00:15:18.168 10:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:18.426 true 00:15:18.426 10:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:18.685 [2024-07-15 10:22:55.792375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:18.685 [2024-07-15 10:22:55.792429] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:18.685 [2024-07-15 10:22:55.792450] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1160910 00:15:18.685 [2024-07-15 10:22:55.792463] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:18.685 [2024-07-15 10:22:55.793918] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:18.685 [2024-07-15 10:22:55.793955] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:18.685 BaseBdev2 00:15:18.685 10:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:18.685 10:22:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:18.943 BaseBdev3_malloc 00:15:18.943 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:19.201 true 00:15:19.201 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:19.458 [2024-07-15 10:22:56.530985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:19.458 [2024-07-15 10:22:56.531031] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:19.458 [2024-07-15 10:22:56.531051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1162bd0 00:15:19.458 [2024-07-15 10:22:56.531063] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:19.458 [2024-07-15 10:22:56.532459] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:15:19.459 [2024-07-15 10:22:56.532486] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:19.459 BaseBdev3 00:15:19.459 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:19.716 [2024-07-15 10:22:56.779675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:19.716 [2024-07-15 10:22:56.780946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:19.716 [2024-07-15 10:22:56.781013] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:19.716 [2024-07-15 10:22:56.781214] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1164280 00:15:19.716 [2024-07-15 10:22:56.781226] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:19.716 [2024-07-15 10:22:56.781422] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1163e20 00:15:19.716 [2024-07-15 10:22:56.781564] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1164280 00:15:19.716 [2024-07-15 10:22:56.781574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1164280 00:15:19.716 [2024-07-15 10:22:56.781677] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:19.716 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:19.716 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:19.716 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:19.716 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid0 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.717 10:22:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:19.974 10:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.974 "name": "raid_bdev1", 00:15:19.974 "uuid": "833949ae-1e6e-465d-b57e-88df2b6eb334", 00:15:19.974 "strip_size_kb": 64, 00:15:19.974 "state": "online", 00:15:19.974 "raid_level": "raid0", 00:15:19.974 "superblock": true, 00:15:19.974 "num_base_bdevs": 3, 00:15:19.974 "num_base_bdevs_discovered": 3, 00:15:19.974 "num_base_bdevs_operational": 3, 00:15:19.974 "base_bdevs_list": [ 00:15:19.974 { 00:15:19.974 "name": "BaseBdev1", 00:15:19.974 "uuid": "e2ef1ad8-2166-5cbc-86ed-935040aee7bb", 00:15:19.974 "is_configured": true, 00:15:19.974 "data_offset": 2048, 00:15:19.974 "data_size": 63488 00:15:19.974 }, 00:15:19.974 { 00:15:19.974 "name": "BaseBdev2", 00:15:19.974 "uuid": "ef02a11b-3e01-5571-b611-dc444a81e767", 00:15:19.974 "is_configured": true, 00:15:19.974 "data_offset": 2048, 00:15:19.974 "data_size": 63488 00:15:19.974 }, 00:15:19.974 { 00:15:19.974 "name": "BaseBdev3", 
00:15:19.974 "uuid": "e5974cd9-959e-5c83-a311-9ac15bf9f23e", 00:15:19.974 "is_configured": true, 00:15:19.974 "data_offset": 2048, 00:15:19.974 "data_size": 63488 00:15:19.974 } 00:15:19.974 ] 00:15:19.974 }' 00:15:19.974 10:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.974 10:22:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.540 10:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:20.540 10:22:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:20.798 [2024-07-15 10:22:57.858876] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb25b0 00:15:21.732 10:22:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:21.990 10:22:59 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.990 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:22.248 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.248 "name": "raid_bdev1", 00:15:22.248 "uuid": "833949ae-1e6e-465d-b57e-88df2b6eb334", 00:15:22.248 "strip_size_kb": 64, 00:15:22.248 "state": "online", 00:15:22.248 "raid_level": "raid0", 00:15:22.248 "superblock": true, 00:15:22.248 "num_base_bdevs": 3, 00:15:22.248 "num_base_bdevs_discovered": 3, 00:15:22.248 "num_base_bdevs_operational": 3, 00:15:22.248 "base_bdevs_list": [ 00:15:22.248 { 00:15:22.248 "name": "BaseBdev1", 00:15:22.248 "uuid": "e2ef1ad8-2166-5cbc-86ed-935040aee7bb", 00:15:22.248 "is_configured": true, 00:15:22.248 "data_offset": 2048, 00:15:22.248 "data_size": 63488 00:15:22.248 }, 00:15:22.248 { 00:15:22.248 "name": "BaseBdev2", 00:15:22.248 "uuid": "ef02a11b-3e01-5571-b611-dc444a81e767", 00:15:22.248 "is_configured": true, 00:15:22.248 "data_offset": 2048, 00:15:22.248 "data_size": 63488 00:15:22.248 }, 00:15:22.248 { 00:15:22.248 "name": "BaseBdev3", 00:15:22.248 "uuid": "e5974cd9-959e-5c83-a311-9ac15bf9f23e", 00:15:22.248 "is_configured": true, 00:15:22.248 "data_offset": 2048, 00:15:22.248 "data_size": 
63488 00:15:22.248 } 00:15:22.248 ] 00:15:22.248 }' 00:15:22.248 10:22:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.248 10:22:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:23.184 [2024-07-15 10:23:00.241133] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:23.184 [2024-07-15 10:23:00.241173] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:23.184 [2024-07-15 10:23:00.244358] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:23.184 [2024-07-15 10:23:00.244397] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:23.184 [2024-07-15 10:23:00.244431] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:23.184 [2024-07-15 10:23:00.244443] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1164280 name raid_bdev1, state offline 00:15:23.184 0 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 506815 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 506815 ']' 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 506815 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 506815 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 506815' 00:15:23.184 killing process with pid 506815 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 506815 00:15:23.184 [2024-07-15 10:23:00.311015] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:23.184 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 506815 00:15:23.184 [2024-07-15 10:23:00.332791] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.3W2bLHzIOB 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:15:23.442 00:15:23.442 real 0m7.190s 00:15:23.442 user 0m11.555s 00:15:23.442 sys 0m1.261s 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:23.442 10:23:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.442 ************************************ 00:15:23.442 END TEST raid_write_error_test 00:15:23.442 ************************************ 
00:15:23.442 10:23:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:23.442 10:23:00 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:23.442 10:23:00 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:15:23.442 10:23:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:23.442 10:23:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:23.442 10:23:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:23.701 ************************************ 00:15:23.701 START TEST raid_state_function_test 00:15:23.701 ************************************ 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ 
)) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=507894 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 507894' 00:15:23.701 Process raid pid: 507894 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 507894 /var/tmp/spdk-raid.sock 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 507894 ']' 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:23.701 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:23.701 10:23:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:23.701 [2024-07-15 10:23:00.725263] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:15:23.701 [2024-07-15 10:23:00.725313] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:23.701 [2024-07-15 10:23:00.838713] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:23.959 [2024-07-15 10:23:00.946899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:23.959 [2024-07-15 10:23:01.010108] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:23.959 [2024-07-15 10:23:01.010142] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:24.526 10:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:24.526 10:23:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:24.526 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:24.783 [2024-07-15 10:23:01.885509] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:24.783 [2024-07-15 10:23:01.885553] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:24.783 [2024-07-15 10:23:01.885564] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:24.783 [2024-07-15 10:23:01.885577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:24.783 [2024-07-15 10:23:01.885587] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:24.783 [2024-07-15 10:23:01.885599] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:24.783 
10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.783 10:23:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.041 10:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.041 "name": "Existed_Raid", 00:15:25.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.041 "strip_size_kb": 64, 00:15:25.041 "state": "configuring", 00:15:25.041 "raid_level": "concat", 00:15:25.041 "superblock": false, 00:15:25.041 "num_base_bdevs": 3, 00:15:25.041 "num_base_bdevs_discovered": 0, 00:15:25.041 "num_base_bdevs_operational": 3, 00:15:25.041 "base_bdevs_list": [ 00:15:25.041 { 
00:15:25.041 "name": "BaseBdev1", 00:15:25.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.041 "is_configured": false, 00:15:25.041 "data_offset": 0, 00:15:25.041 "data_size": 0 00:15:25.041 }, 00:15:25.041 { 00:15:25.041 "name": "BaseBdev2", 00:15:25.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.041 "is_configured": false, 00:15:25.041 "data_offset": 0, 00:15:25.041 "data_size": 0 00:15:25.041 }, 00:15:25.041 { 00:15:25.041 "name": "BaseBdev3", 00:15:25.041 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.041 "is_configured": false, 00:15:25.041 "data_offset": 0, 00:15:25.041 "data_size": 0 00:15:25.041 } 00:15:25.041 ] 00:15:25.041 }' 00:15:25.041 10:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.041 10:23:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.605 10:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:25.862 [2024-07-15 10:23:02.888035] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:25.862 [2024-07-15 10:23:02.888071] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d05a80 name Existed_Raid, state configuring 00:15:25.862 10:23:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:26.119 [2024-07-15 10:23:03.136706] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:26.119 [2024-07-15 10:23:03.136742] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:26.119 [2024-07-15 10:23:03.136752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:15:26.119 [2024-07-15 10:23:03.136764] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:26.119 [2024-07-15 10:23:03.136773] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:26.119 [2024-07-15 10:23:03.136784] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:26.119 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:26.376 [2024-07-15 10:23:03.391284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:26.376 BaseBdev1 00:15:26.376 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:26.376 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:26.376 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:26.376 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:26.376 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:26.376 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:26.376 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:26.634 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:26.892 [ 00:15:26.892 { 00:15:26.892 "name": "BaseBdev1", 00:15:26.892 "aliases": [ 00:15:26.892 
"a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad" 00:15:26.892 ], 00:15:26.892 "product_name": "Malloc disk", 00:15:26.892 "block_size": 512, 00:15:26.892 "num_blocks": 65536, 00:15:26.892 "uuid": "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad", 00:15:26.892 "assigned_rate_limits": { 00:15:26.892 "rw_ios_per_sec": 0, 00:15:26.892 "rw_mbytes_per_sec": 0, 00:15:26.892 "r_mbytes_per_sec": 0, 00:15:26.892 "w_mbytes_per_sec": 0 00:15:26.892 }, 00:15:26.892 "claimed": true, 00:15:26.892 "claim_type": "exclusive_write", 00:15:26.892 "zoned": false, 00:15:26.892 "supported_io_types": { 00:15:26.892 "read": true, 00:15:26.892 "write": true, 00:15:26.892 "unmap": true, 00:15:26.892 "flush": true, 00:15:26.892 "reset": true, 00:15:26.892 "nvme_admin": false, 00:15:26.892 "nvme_io": false, 00:15:26.892 "nvme_io_md": false, 00:15:26.892 "write_zeroes": true, 00:15:26.892 "zcopy": true, 00:15:26.892 "get_zone_info": false, 00:15:26.892 "zone_management": false, 00:15:26.892 "zone_append": false, 00:15:26.892 "compare": false, 00:15:26.892 "compare_and_write": false, 00:15:26.892 "abort": true, 00:15:26.892 "seek_hole": false, 00:15:26.892 "seek_data": false, 00:15:26.892 "copy": true, 00:15:26.892 "nvme_iov_md": false 00:15:26.892 }, 00:15:26.892 "memory_domains": [ 00:15:26.892 { 00:15:26.892 "dma_device_id": "system", 00:15:26.892 "dma_device_type": 1 00:15:26.892 }, 00:15:26.892 { 00:15:26.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.892 "dma_device_type": 2 00:15:26.892 } 00:15:26.892 ], 00:15:26.892 "driver_specific": {} 00:15:26.892 } 00:15:26.892 ] 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.892 10:23:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.150 10:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.150 "name": "Existed_Raid", 00:15:27.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.150 "strip_size_kb": 64, 00:15:27.150 "state": "configuring", 00:15:27.150 "raid_level": "concat", 00:15:27.150 "superblock": false, 00:15:27.150 "num_base_bdevs": 3, 00:15:27.150 "num_base_bdevs_discovered": 1, 00:15:27.150 "num_base_bdevs_operational": 3, 00:15:27.150 "base_bdevs_list": [ 00:15:27.150 { 00:15:27.150 "name": "BaseBdev1", 00:15:27.150 "uuid": "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad", 00:15:27.150 "is_configured": true, 00:15:27.150 "data_offset": 0, 00:15:27.150 "data_size": 65536 00:15:27.150 }, 00:15:27.150 { 00:15:27.150 "name": "BaseBdev2", 00:15:27.150 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:27.150 "is_configured": false, 00:15:27.150 "data_offset": 0, 00:15:27.150 "data_size": 0 00:15:27.150 }, 00:15:27.150 { 00:15:27.150 "name": "BaseBdev3", 00:15:27.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:27.150 "is_configured": false, 00:15:27.150 "data_offset": 0, 00:15:27.150 "data_size": 0 00:15:27.150 } 00:15:27.150 ] 00:15:27.150 }' 00:15:27.150 10:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.150 10:23:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:27.716 10:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:27.972 [2024-07-15 10:23:04.975458] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:27.972 [2024-07-15 10:23:04.975500] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d05310 name Existed_Raid, state configuring 00:15:27.973 10:23:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:28.230 [2024-07-15 10:23:05.220134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:28.230 [2024-07-15 10:23:05.221596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:28.230 [2024-07-15 10:23:05.221633] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:28.230 [2024-07-15 10:23:05.221643] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:28.230 [2024-07-15 10:23:05.221655] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.230 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.487 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.487 "name": "Existed_Raid", 00:15:28.487 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.487 "strip_size_kb": 64, 00:15:28.487 "state": "configuring", 00:15:28.487 
"raid_level": "concat", 00:15:28.487 "superblock": false, 00:15:28.487 "num_base_bdevs": 3, 00:15:28.487 "num_base_bdevs_discovered": 1, 00:15:28.487 "num_base_bdevs_operational": 3, 00:15:28.487 "base_bdevs_list": [ 00:15:28.487 { 00:15:28.487 "name": "BaseBdev1", 00:15:28.487 "uuid": "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad", 00:15:28.487 "is_configured": true, 00:15:28.487 "data_offset": 0, 00:15:28.487 "data_size": 65536 00:15:28.487 }, 00:15:28.487 { 00:15:28.487 "name": "BaseBdev2", 00:15:28.487 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.487 "is_configured": false, 00:15:28.487 "data_offset": 0, 00:15:28.487 "data_size": 0 00:15:28.487 }, 00:15:28.487 { 00:15:28.488 "name": "BaseBdev3", 00:15:28.488 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.488 "is_configured": false, 00:15:28.488 "data_offset": 0, 00:15:28.488 "data_size": 0 00:15:28.488 } 00:15:28.488 ] 00:15:28.488 }' 00:15:28.488 10:23:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.488 10:23:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.051 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:29.307 [2024-07-15 10:23:06.326495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:29.307 BaseBdev2 00:15:29.307 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:29.307 10:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:29.307 10:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:29.307 10:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:29.307 10:23:06 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:29.307 10:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:29.307 10:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.565 10:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:29.822 [ 00:15:29.822 { 00:15:29.822 "name": "BaseBdev2", 00:15:29.822 "aliases": [ 00:15:29.822 "acbbf783-f728-4f59-a74b-40663e2aee22" 00:15:29.822 ], 00:15:29.822 "product_name": "Malloc disk", 00:15:29.822 "block_size": 512, 00:15:29.822 "num_blocks": 65536, 00:15:29.822 "uuid": "acbbf783-f728-4f59-a74b-40663e2aee22", 00:15:29.823 "assigned_rate_limits": { 00:15:29.823 "rw_ios_per_sec": 0, 00:15:29.823 "rw_mbytes_per_sec": 0, 00:15:29.823 "r_mbytes_per_sec": 0, 00:15:29.823 "w_mbytes_per_sec": 0 00:15:29.823 }, 00:15:29.823 "claimed": true, 00:15:29.823 "claim_type": "exclusive_write", 00:15:29.823 "zoned": false, 00:15:29.823 "supported_io_types": { 00:15:29.823 "read": true, 00:15:29.823 "write": true, 00:15:29.823 "unmap": true, 00:15:29.823 "flush": true, 00:15:29.823 "reset": true, 00:15:29.823 "nvme_admin": false, 00:15:29.823 "nvme_io": false, 00:15:29.823 "nvme_io_md": false, 00:15:29.823 "write_zeroes": true, 00:15:29.823 "zcopy": true, 00:15:29.823 "get_zone_info": false, 00:15:29.823 "zone_management": false, 00:15:29.823 "zone_append": false, 00:15:29.823 "compare": false, 00:15:29.823 "compare_and_write": false, 00:15:29.823 "abort": true, 00:15:29.823 "seek_hole": false, 00:15:29.823 "seek_data": false, 00:15:29.823 "copy": true, 00:15:29.823 "nvme_iov_md": false 00:15:29.823 }, 00:15:29.823 "memory_domains": [ 00:15:29.823 { 00:15:29.823 "dma_device_id": "system", 
00:15:29.823 "dma_device_type": 1 00:15:29.823 }, 00:15:29.823 { 00:15:29.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.823 "dma_device_type": 2 00:15:29.823 } 00:15:29.823 ], 00:15:29.823 "driver_specific": {} 00:15:29.823 } 00:15:29.823 ] 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.823 10:23:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.079 10:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.079 "name": "Existed_Raid", 00:15:30.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.079 "strip_size_kb": 64, 00:15:30.079 "state": "configuring", 00:15:30.079 "raid_level": "concat", 00:15:30.079 "superblock": false, 00:15:30.079 "num_base_bdevs": 3, 00:15:30.079 "num_base_bdevs_discovered": 2, 00:15:30.079 "num_base_bdevs_operational": 3, 00:15:30.079 "base_bdevs_list": [ 00:15:30.079 { 00:15:30.079 "name": "BaseBdev1", 00:15:30.079 "uuid": "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad", 00:15:30.079 "is_configured": true, 00:15:30.079 "data_offset": 0, 00:15:30.079 "data_size": 65536 00:15:30.079 }, 00:15:30.079 { 00:15:30.079 "name": "BaseBdev2", 00:15:30.079 "uuid": "acbbf783-f728-4f59-a74b-40663e2aee22", 00:15:30.079 "is_configured": true, 00:15:30.079 "data_offset": 0, 00:15:30.079 "data_size": 65536 00:15:30.079 }, 00:15:30.079 { 00:15:30.079 "name": "BaseBdev3", 00:15:30.079 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.079 "is_configured": false, 00:15:30.079 "data_offset": 0, 00:15:30.079 "data_size": 0 00:15:30.079 } 00:15:30.079 ] 00:15:30.079 }' 00:15:30.079 10:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.079 10:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.644 10:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:30.903 [2024-07-15 10:23:07.882071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:30.903 [2024-07-15 10:23:07.882111] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d06400 00:15:30.903 [2024-07-15 10:23:07.882120] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:30.903 [2024-07-15 10:23:07.882371] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d05ef0 00:15:30.903 [2024-07-15 10:23:07.882491] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d06400 00:15:30.903 [2024-07-15 10:23:07.882501] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d06400 00:15:30.903 [2024-07-15 10:23:07.882666] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:30.903 BaseBdev3 00:15:30.903 10:23:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:30.903 10:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:30.903 10:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:30.903 10:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:30.903 10:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:30.903 10:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:30.903 10:23:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:31.161 10:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:31.419 [ 00:15:31.419 { 00:15:31.419 "name": "BaseBdev3", 00:15:31.419 "aliases": [ 00:15:31.419 "acd433b7-8d45-42dc-8bf6-dff5a2377fd0" 00:15:31.419 ], 00:15:31.419 "product_name": "Malloc disk", 00:15:31.419 "block_size": 512, 00:15:31.419 "num_blocks": 65536, 00:15:31.419 
"uuid": "acd433b7-8d45-42dc-8bf6-dff5a2377fd0", 00:15:31.419 "assigned_rate_limits": { 00:15:31.419 "rw_ios_per_sec": 0, 00:15:31.419 "rw_mbytes_per_sec": 0, 00:15:31.419 "r_mbytes_per_sec": 0, 00:15:31.419 "w_mbytes_per_sec": 0 00:15:31.419 }, 00:15:31.419 "claimed": true, 00:15:31.419 "claim_type": "exclusive_write", 00:15:31.419 "zoned": false, 00:15:31.419 "supported_io_types": { 00:15:31.419 "read": true, 00:15:31.419 "write": true, 00:15:31.419 "unmap": true, 00:15:31.419 "flush": true, 00:15:31.419 "reset": true, 00:15:31.419 "nvme_admin": false, 00:15:31.419 "nvme_io": false, 00:15:31.419 "nvme_io_md": false, 00:15:31.419 "write_zeroes": true, 00:15:31.419 "zcopy": true, 00:15:31.419 "get_zone_info": false, 00:15:31.419 "zone_management": false, 00:15:31.419 "zone_append": false, 00:15:31.419 "compare": false, 00:15:31.419 "compare_and_write": false, 00:15:31.419 "abort": true, 00:15:31.419 "seek_hole": false, 00:15:31.419 "seek_data": false, 00:15:31.419 "copy": true, 00:15:31.419 "nvme_iov_md": false 00:15:31.419 }, 00:15:31.419 "memory_domains": [ 00:15:31.419 { 00:15:31.419 "dma_device_id": "system", 00:15:31.419 "dma_device_type": 1 00:15:31.419 }, 00:15:31.419 { 00:15:31.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.419 "dma_device_type": 2 00:15:31.419 } 00:15:31.419 ], 00:15:31.419 "driver_specific": {} 00:15:31.419 } 00:15:31.419 ] 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.419 10:23:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.419 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.677 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.677 "name": "Existed_Raid", 00:15:31.677 "uuid": "8d666e93-9e3b-47db-8594-bee2922ccd52", 00:15:31.677 "strip_size_kb": 64, 00:15:31.677 "state": "online", 00:15:31.677 "raid_level": "concat", 00:15:31.677 "superblock": false, 00:15:31.677 "num_base_bdevs": 3, 00:15:31.677 "num_base_bdevs_discovered": 3, 00:15:31.677 "num_base_bdevs_operational": 3, 00:15:31.677 "base_bdevs_list": [ 00:15:31.677 { 00:15:31.677 "name": "BaseBdev1", 00:15:31.677 "uuid": "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad", 00:15:31.677 "is_configured": true, 00:15:31.677 "data_offset": 0, 00:15:31.677 "data_size": 65536 00:15:31.677 }, 00:15:31.677 { 00:15:31.677 "name": "BaseBdev2", 00:15:31.677 "uuid": 
"acbbf783-f728-4f59-a74b-40663e2aee22", 00:15:31.677 "is_configured": true, 00:15:31.677 "data_offset": 0, 00:15:31.677 "data_size": 65536 00:15:31.677 }, 00:15:31.677 { 00:15:31.677 "name": "BaseBdev3", 00:15:31.677 "uuid": "acd433b7-8d45-42dc-8bf6-dff5a2377fd0", 00:15:31.677 "is_configured": true, 00:15:31.677 "data_offset": 0, 00:15:31.677 "data_size": 65536 00:15:31.677 } 00:15:31.677 ] 00:15:31.677 }' 00:15:31.677 10:23:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.677 10:23:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:32.271 [2024-07-15 10:23:09.446548] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:32.271 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:32.271 "name": "Existed_Raid", 00:15:32.271 "aliases": [ 00:15:32.271 "8d666e93-9e3b-47db-8594-bee2922ccd52" 00:15:32.271 ], 00:15:32.271 "product_name": "Raid Volume", 
00:15:32.271 "block_size": 512, 00:15:32.271 "num_blocks": 196608, 00:15:32.271 "uuid": "8d666e93-9e3b-47db-8594-bee2922ccd52", 00:15:32.271 "assigned_rate_limits": { 00:15:32.271 "rw_ios_per_sec": 0, 00:15:32.271 "rw_mbytes_per_sec": 0, 00:15:32.271 "r_mbytes_per_sec": 0, 00:15:32.271 "w_mbytes_per_sec": 0 00:15:32.271 }, 00:15:32.271 "claimed": false, 00:15:32.271 "zoned": false, 00:15:32.271 "supported_io_types": { 00:15:32.271 "read": true, 00:15:32.271 "write": true, 00:15:32.271 "unmap": true, 00:15:32.271 "flush": true, 00:15:32.271 "reset": true, 00:15:32.271 "nvme_admin": false, 00:15:32.271 "nvme_io": false, 00:15:32.271 "nvme_io_md": false, 00:15:32.271 "write_zeroes": true, 00:15:32.271 "zcopy": false, 00:15:32.271 "get_zone_info": false, 00:15:32.271 "zone_management": false, 00:15:32.271 "zone_append": false, 00:15:32.271 "compare": false, 00:15:32.271 "compare_and_write": false, 00:15:32.271 "abort": false, 00:15:32.271 "seek_hole": false, 00:15:32.271 "seek_data": false, 00:15:32.271 "copy": false, 00:15:32.271 "nvme_iov_md": false 00:15:32.271 }, 00:15:32.271 "memory_domains": [ 00:15:32.271 { 00:15:32.271 "dma_device_id": "system", 00:15:32.271 "dma_device_type": 1 00:15:32.271 }, 00:15:32.271 { 00:15:32.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.271 "dma_device_type": 2 00:15:32.271 }, 00:15:32.271 { 00:15:32.271 "dma_device_id": "system", 00:15:32.271 "dma_device_type": 1 00:15:32.271 }, 00:15:32.271 { 00:15:32.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.271 "dma_device_type": 2 00:15:32.271 }, 00:15:32.271 { 00:15:32.271 "dma_device_id": "system", 00:15:32.271 "dma_device_type": 1 00:15:32.271 }, 00:15:32.271 { 00:15:32.271 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.271 "dma_device_type": 2 00:15:32.271 } 00:15:32.271 ], 00:15:32.271 "driver_specific": { 00:15:32.271 "raid": { 00:15:32.271 "uuid": "8d666e93-9e3b-47db-8594-bee2922ccd52", 00:15:32.271 "strip_size_kb": 64, 00:15:32.271 "state": "online", 00:15:32.271 
"raid_level": "concat", 00:15:32.272 "superblock": false, 00:15:32.272 "num_base_bdevs": 3, 00:15:32.272 "num_base_bdevs_discovered": 3, 00:15:32.272 "num_base_bdevs_operational": 3, 00:15:32.272 "base_bdevs_list": [ 00:15:32.272 { 00:15:32.272 "name": "BaseBdev1", 00:15:32.272 "uuid": "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad", 00:15:32.272 "is_configured": true, 00:15:32.272 "data_offset": 0, 00:15:32.272 "data_size": 65536 00:15:32.272 }, 00:15:32.272 { 00:15:32.272 "name": "BaseBdev2", 00:15:32.272 "uuid": "acbbf783-f728-4f59-a74b-40663e2aee22", 00:15:32.272 "is_configured": true, 00:15:32.272 "data_offset": 0, 00:15:32.272 "data_size": 65536 00:15:32.272 }, 00:15:32.272 { 00:15:32.272 "name": "BaseBdev3", 00:15:32.272 "uuid": "acd433b7-8d45-42dc-8bf6-dff5a2377fd0", 00:15:32.272 "is_configured": true, 00:15:32.272 "data_offset": 0, 00:15:32.272 "data_size": 65536 00:15:32.272 } 00:15:32.272 ] 00:15:32.272 } 00:15:32.272 } 00:15:32.272 }' 00:15:32.529 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:32.529 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:32.529 BaseBdev2 00:15:32.529 BaseBdev3' 00:15:32.529 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.529 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:32.529 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.787 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.788 "name": "BaseBdev1", 00:15:32.788 "aliases": [ 00:15:32.788 "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad" 00:15:32.788 ], 00:15:32.788 "product_name": "Malloc disk", 00:15:32.788 
"block_size": 512, 00:15:32.788 "num_blocks": 65536, 00:15:32.788 "uuid": "a36f72bd-38c3-4bdf-a62d-aca7a8d6a0ad", 00:15:32.788 "assigned_rate_limits": { 00:15:32.788 "rw_ios_per_sec": 0, 00:15:32.788 "rw_mbytes_per_sec": 0, 00:15:32.788 "r_mbytes_per_sec": 0, 00:15:32.788 "w_mbytes_per_sec": 0 00:15:32.788 }, 00:15:32.788 "claimed": true, 00:15:32.788 "claim_type": "exclusive_write", 00:15:32.788 "zoned": false, 00:15:32.788 "supported_io_types": { 00:15:32.788 "read": true, 00:15:32.788 "write": true, 00:15:32.788 "unmap": true, 00:15:32.788 "flush": true, 00:15:32.788 "reset": true, 00:15:32.788 "nvme_admin": false, 00:15:32.788 "nvme_io": false, 00:15:32.788 "nvme_io_md": false, 00:15:32.788 "write_zeroes": true, 00:15:32.788 "zcopy": true, 00:15:32.788 "get_zone_info": false, 00:15:32.788 "zone_management": false, 00:15:32.788 "zone_append": false, 00:15:32.788 "compare": false, 00:15:32.788 "compare_and_write": false, 00:15:32.788 "abort": true, 00:15:32.788 "seek_hole": false, 00:15:32.788 "seek_data": false, 00:15:32.788 "copy": true, 00:15:32.788 "nvme_iov_md": false 00:15:32.788 }, 00:15:32.788 "memory_domains": [ 00:15:32.788 { 00:15:32.788 "dma_device_id": "system", 00:15:32.788 "dma_device_type": 1 00:15:32.788 }, 00:15:32.788 { 00:15:32.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.788 "dma_device_type": 2 00:15:32.788 } 00:15:32.788 ], 00:15:32.788 "driver_specific": {} 00:15:32.788 }' 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.788 10:23:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.047 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.047 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.047 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.047 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.047 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.047 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.047 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:33.305 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.305 "name": "BaseBdev2", 00:15:33.305 "aliases": [ 00:15:33.305 "acbbf783-f728-4f59-a74b-40663e2aee22" 00:15:33.305 ], 00:15:33.305 "product_name": "Malloc disk", 00:15:33.305 "block_size": 512, 00:15:33.305 "num_blocks": 65536, 00:15:33.305 "uuid": "acbbf783-f728-4f59-a74b-40663e2aee22", 00:15:33.305 "assigned_rate_limits": { 00:15:33.305 "rw_ios_per_sec": 0, 00:15:33.305 "rw_mbytes_per_sec": 0, 00:15:33.305 "r_mbytes_per_sec": 0, 00:15:33.305 "w_mbytes_per_sec": 0 00:15:33.305 }, 00:15:33.305 "claimed": true, 00:15:33.305 "claim_type": "exclusive_write", 00:15:33.305 "zoned": false, 00:15:33.305 "supported_io_types": { 00:15:33.305 "read": true, 00:15:33.305 "write": true, 00:15:33.305 "unmap": true, 00:15:33.305 "flush": true, 00:15:33.305 "reset": true, 00:15:33.305 "nvme_admin": 
false, 00:15:33.305 "nvme_io": false, 00:15:33.305 "nvme_io_md": false, 00:15:33.305 "write_zeroes": true, 00:15:33.305 "zcopy": true, 00:15:33.305 "get_zone_info": false, 00:15:33.305 "zone_management": false, 00:15:33.305 "zone_append": false, 00:15:33.305 "compare": false, 00:15:33.305 "compare_and_write": false, 00:15:33.305 "abort": true, 00:15:33.305 "seek_hole": false, 00:15:33.305 "seek_data": false, 00:15:33.305 "copy": true, 00:15:33.305 "nvme_iov_md": false 00:15:33.305 }, 00:15:33.305 "memory_domains": [ 00:15:33.305 { 00:15:33.305 "dma_device_id": "system", 00:15:33.305 "dma_device_type": 1 00:15:33.305 }, 00:15:33.305 { 00:15:33.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.305 "dma_device_type": 2 00:15:33.305 } 00:15:33.305 ], 00:15:33.305 "driver_specific": {} 00:15:33.305 }' 00:15:33.305 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.305 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.305 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.305 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.305 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:33.563 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:33.821 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.821 "name": "BaseBdev3", 00:15:33.821 "aliases": [ 00:15:33.821 "acd433b7-8d45-42dc-8bf6-dff5a2377fd0" 00:15:33.821 ], 00:15:33.821 "product_name": "Malloc disk", 00:15:33.821 "block_size": 512, 00:15:33.821 "num_blocks": 65536, 00:15:33.821 "uuid": "acd433b7-8d45-42dc-8bf6-dff5a2377fd0", 00:15:33.821 "assigned_rate_limits": { 00:15:33.821 "rw_ios_per_sec": 0, 00:15:33.821 "rw_mbytes_per_sec": 0, 00:15:33.821 "r_mbytes_per_sec": 0, 00:15:33.821 "w_mbytes_per_sec": 0 00:15:33.821 }, 00:15:33.821 "claimed": true, 00:15:33.821 "claim_type": "exclusive_write", 00:15:33.821 "zoned": false, 00:15:33.821 "supported_io_types": { 00:15:33.821 "read": true, 00:15:33.821 "write": true, 00:15:33.821 "unmap": true, 00:15:33.821 "flush": true, 00:15:33.821 "reset": true, 00:15:33.821 "nvme_admin": false, 00:15:33.821 "nvme_io": false, 00:15:33.821 "nvme_io_md": false, 00:15:33.821 "write_zeroes": true, 00:15:33.821 "zcopy": true, 00:15:33.821 "get_zone_info": false, 00:15:33.821 "zone_management": false, 00:15:33.821 "zone_append": false, 00:15:33.821 "compare": false, 00:15:33.821 "compare_and_write": false, 00:15:33.821 "abort": true, 00:15:33.821 "seek_hole": false, 00:15:33.821 "seek_data": false, 00:15:33.821 "copy": true, 00:15:33.821 "nvme_iov_md": false 00:15:33.821 }, 00:15:33.821 "memory_domains": [ 00:15:33.821 { 00:15:33.821 "dma_device_id": "system", 00:15:33.821 "dma_device_type": 1 00:15:33.821 
}, 00:15:33.821 { 00:15:33.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.821 "dma_device_type": 2 00:15:33.821 } 00:15:33.821 ], 00:15:33.821 "driver_specific": {} 00:15:33.821 }' 00:15:33.821 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.821 10:23:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:34.078 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:34.336 [2024-07-15 10:23:11.499949] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:34.336 [2024-07-15 10:23:11.499979] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:34.336 [2024-07-15 10:23:11.500020] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:34.336 
10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.336 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:34.594 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.594 "name": "Existed_Raid", 00:15:34.594 "uuid": "8d666e93-9e3b-47db-8594-bee2922ccd52", 00:15:34.594 "strip_size_kb": 64, 00:15:34.594 "state": "offline", 00:15:34.594 "raid_level": "concat", 00:15:34.594 "superblock": false, 00:15:34.594 "num_base_bdevs": 3, 00:15:34.594 "num_base_bdevs_discovered": 2, 00:15:34.594 "num_base_bdevs_operational": 2, 00:15:34.594 "base_bdevs_list": [ 00:15:34.594 { 00:15:34.594 "name": null, 00:15:34.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.594 "is_configured": false, 00:15:34.594 "data_offset": 0, 00:15:34.594 "data_size": 65536 00:15:34.594 }, 00:15:34.594 { 00:15:34.594 "name": "BaseBdev2", 00:15:34.594 "uuid": "acbbf783-f728-4f59-a74b-40663e2aee22", 00:15:34.594 "is_configured": true, 00:15:34.594 "data_offset": 0, 00:15:34.594 "data_size": 65536 00:15:34.594 }, 00:15:34.594 { 00:15:34.594 "name": "BaseBdev3", 00:15:34.594 "uuid": "acd433b7-8d45-42dc-8bf6-dff5a2377fd0", 00:15:34.594 "is_configured": true, 00:15:34.594 "data_offset": 0, 00:15:34.594 "data_size": 65536 00:15:34.594 } 00:15:34.594 ] 00:15:34.594 }' 00:15:34.594 10:23:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.594 10:23:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.529 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:35.529 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:35.529 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.529 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:35.529 10:23:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:35.529 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:35.529 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:35.788 [2024-07-15 10:23:12.784396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:35.788 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:35.788 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:35.788 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.788 10:23:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:36.046 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:36.046 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:36.046 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:36.304 [2024-07-15 10:23:13.296498] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:36.304 [2024-07-15 10:23:13.296550] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d06400 name Existed_Raid, state offline 00:15:36.304 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:36.304 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:36.304 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.304 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:36.562 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:36.562 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:36.562 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:36.562 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:36.562 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:36.562 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:36.821 BaseBdev2 00:15:36.821 10:23:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:36.821 10:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:36.821 10:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:36.821 10:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:36.821 10:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:36.821 10:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:36.821 10:23:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:37.079 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:37.337 [ 00:15:37.337 { 00:15:37.337 "name": "BaseBdev2", 00:15:37.337 "aliases": [ 00:15:37.337 "23083494-d614-4837-af20-744b061bdcb9" 00:15:37.337 ], 00:15:37.337 "product_name": "Malloc disk", 00:15:37.337 "block_size": 512, 00:15:37.337 "num_blocks": 65536, 00:15:37.337 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:37.337 "assigned_rate_limits": { 00:15:37.337 "rw_ios_per_sec": 0, 00:15:37.337 "rw_mbytes_per_sec": 0, 00:15:37.337 "r_mbytes_per_sec": 0, 00:15:37.337 "w_mbytes_per_sec": 0 00:15:37.337 }, 00:15:37.337 "claimed": false, 00:15:37.337 "zoned": false, 00:15:37.337 "supported_io_types": { 00:15:37.337 "read": true, 00:15:37.337 "write": true, 00:15:37.337 "unmap": true, 00:15:37.337 "flush": true, 00:15:37.337 "reset": true, 00:15:37.337 "nvme_admin": false, 00:15:37.337 "nvme_io": false, 00:15:37.337 "nvme_io_md": false, 00:15:37.337 "write_zeroes": true, 00:15:37.337 "zcopy": true, 00:15:37.337 "get_zone_info": false, 00:15:37.337 "zone_management": false, 00:15:37.337 "zone_append": false, 00:15:37.337 "compare": false, 00:15:37.337 "compare_and_write": false, 00:15:37.337 "abort": true, 00:15:37.337 "seek_hole": false, 00:15:37.337 "seek_data": false, 00:15:37.337 "copy": true, 00:15:37.337 "nvme_iov_md": false 00:15:37.337 }, 00:15:37.337 "memory_domains": [ 00:15:37.337 { 00:15:37.337 "dma_device_id": "system", 00:15:37.337 "dma_device_type": 1 00:15:37.337 }, 00:15:37.337 { 00:15:37.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.337 "dma_device_type": 2 00:15:37.337 } 00:15:37.337 ], 00:15:37.337 "driver_specific": {} 00:15:37.337 } 00:15:37.337 ] 00:15:37.337 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:37.337 10:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:37.337 10:23:14 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:37.337 10:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:37.337 BaseBdev3 00:15:37.596 10:23:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:37.596 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:37.596 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:37.596 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:37.596 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:37.596 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:37.596 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:37.854 10:23:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:37.854 [ 00:15:37.854 { 00:15:37.854 "name": "BaseBdev3", 00:15:37.854 "aliases": [ 00:15:37.854 "6d7e15ed-aa27-43fc-b95f-747660e053ec" 00:15:37.854 ], 00:15:37.854 "product_name": "Malloc disk", 00:15:37.854 "block_size": 512, 00:15:37.854 "num_blocks": 65536, 00:15:37.854 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:37.854 "assigned_rate_limits": { 00:15:37.854 "rw_ios_per_sec": 0, 00:15:37.854 "rw_mbytes_per_sec": 0, 00:15:37.854 "r_mbytes_per_sec": 0, 00:15:37.854 "w_mbytes_per_sec": 0 00:15:37.854 }, 00:15:37.854 "claimed": false, 00:15:37.854 "zoned": false, 00:15:37.854 
"supported_io_types": { 00:15:37.854 "read": true, 00:15:37.854 "write": true, 00:15:37.854 "unmap": true, 00:15:37.854 "flush": true, 00:15:37.854 "reset": true, 00:15:37.854 "nvme_admin": false, 00:15:37.854 "nvme_io": false, 00:15:37.854 "nvme_io_md": false, 00:15:37.854 "write_zeroes": true, 00:15:37.854 "zcopy": true, 00:15:37.854 "get_zone_info": false, 00:15:37.854 "zone_management": false, 00:15:37.854 "zone_append": false, 00:15:37.854 "compare": false, 00:15:37.854 "compare_and_write": false, 00:15:37.854 "abort": true, 00:15:37.854 "seek_hole": false, 00:15:37.854 "seek_data": false, 00:15:37.854 "copy": true, 00:15:37.854 "nvme_iov_md": false 00:15:37.854 }, 00:15:37.854 "memory_domains": [ 00:15:37.854 { 00:15:37.854 "dma_device_id": "system", 00:15:37.854 "dma_device_type": 1 00:15:37.854 }, 00:15:37.854 { 00:15:37.854 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.854 "dma_device_type": 2 00:15:37.854 } 00:15:37.854 ], 00:15:37.854 "driver_specific": {} 00:15:37.854 } 00:15:37.854 ] 00:15:37.854 10:23:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:37.854 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:37.854 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:37.854 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:38.112 [2024-07-15 10:23:15.264246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:38.112 [2024-07-15 10:23:15.264294] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:38.112 [2024-07-15 10:23:15.264319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:38.112 
[2024-07-15 10:23:15.265710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:38.112 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:38.112 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.112 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.112 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.113 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.370 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.370 "name": "Existed_Raid", 00:15:38.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.370 "strip_size_kb": 64, 00:15:38.370 "state": "configuring", 00:15:38.370 "raid_level": "concat", 00:15:38.370 "superblock": false, 00:15:38.370 "num_base_bdevs": 3, 00:15:38.370 
"num_base_bdevs_discovered": 2, 00:15:38.370 "num_base_bdevs_operational": 3, 00:15:38.370 "base_bdevs_list": [ 00:15:38.370 { 00:15:38.370 "name": "BaseBdev1", 00:15:38.370 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.370 "is_configured": false, 00:15:38.370 "data_offset": 0, 00:15:38.370 "data_size": 0 00:15:38.370 }, 00:15:38.370 { 00:15:38.370 "name": "BaseBdev2", 00:15:38.370 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:38.370 "is_configured": true, 00:15:38.370 "data_offset": 0, 00:15:38.370 "data_size": 65536 00:15:38.370 }, 00:15:38.370 { 00:15:38.370 "name": "BaseBdev3", 00:15:38.370 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:38.370 "is_configured": true, 00:15:38.370 "data_offset": 0, 00:15:38.370 "data_size": 65536 00:15:38.370 } 00:15:38.370 ] 00:15:38.370 }' 00:15:38.370 10:23:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.370 10:23:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.935 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:39.194 [2024-07-15 10:23:16.343090] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.194 10:23:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.194 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:39.451 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:39.451 "name": "Existed_Raid", 00:15:39.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.451 "strip_size_kb": 64, 00:15:39.451 "state": "configuring", 00:15:39.451 "raid_level": "concat", 00:15:39.451 "superblock": false, 00:15:39.451 "num_base_bdevs": 3, 00:15:39.451 "num_base_bdevs_discovered": 1, 00:15:39.451 "num_base_bdevs_operational": 3, 00:15:39.451 "base_bdevs_list": [ 00:15:39.451 { 00:15:39.451 "name": "BaseBdev1", 00:15:39.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:39.451 "is_configured": false, 00:15:39.451 "data_offset": 0, 00:15:39.451 "data_size": 0 00:15:39.451 }, 00:15:39.451 { 00:15:39.451 "name": null, 00:15:39.451 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:39.451 "is_configured": false, 00:15:39.451 "data_offset": 0, 00:15:39.451 "data_size": 65536 00:15:39.451 }, 00:15:39.451 { 00:15:39.451 "name": "BaseBdev3", 00:15:39.451 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:39.451 "is_configured": true, 00:15:39.451 "data_offset": 0, 
00:15:39.451 "data_size": 65536 00:15:39.451 } 00:15:39.451 ] 00:15:39.451 }' 00:15:39.451 10:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:39.451 10:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.384 10:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.384 10:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:40.384 10:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:40.384 10:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:40.641 [2024-07-15 10:23:17.691296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:40.641 BaseBdev1 00:15:40.641 10:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:40.641 10:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:40.641 10:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:40.641 10:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:40.641 10:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:40.641 10:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:40.641 10:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:40.899 10:23:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:41.157 [ 00:15:41.157 { 00:15:41.157 "name": "BaseBdev1", 00:15:41.157 "aliases": [ 00:15:41.157 "cf1b99e3-aac9-4787-83bd-bc56b50b62ba" 00:15:41.157 ], 00:15:41.157 "product_name": "Malloc disk", 00:15:41.157 "block_size": 512, 00:15:41.157 "num_blocks": 65536, 00:15:41.157 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:41.157 "assigned_rate_limits": { 00:15:41.157 "rw_ios_per_sec": 0, 00:15:41.157 "rw_mbytes_per_sec": 0, 00:15:41.157 "r_mbytes_per_sec": 0, 00:15:41.157 "w_mbytes_per_sec": 0 00:15:41.157 }, 00:15:41.157 "claimed": true, 00:15:41.157 "claim_type": "exclusive_write", 00:15:41.157 "zoned": false, 00:15:41.157 "supported_io_types": { 00:15:41.157 "read": true, 00:15:41.157 "write": true, 00:15:41.157 "unmap": true, 00:15:41.157 "flush": true, 00:15:41.157 "reset": true, 00:15:41.157 "nvme_admin": false, 00:15:41.157 "nvme_io": false, 00:15:41.157 "nvme_io_md": false, 00:15:41.157 "write_zeroes": true, 00:15:41.157 "zcopy": true, 00:15:41.157 "get_zone_info": false, 00:15:41.157 "zone_management": false, 00:15:41.157 "zone_append": false, 00:15:41.157 "compare": false, 00:15:41.157 "compare_and_write": false, 00:15:41.157 "abort": true, 00:15:41.157 "seek_hole": false, 00:15:41.157 "seek_data": false, 00:15:41.157 "copy": true, 00:15:41.157 "nvme_iov_md": false 00:15:41.157 }, 00:15:41.157 "memory_domains": [ 00:15:41.157 { 00:15:41.157 "dma_device_id": "system", 00:15:41.157 "dma_device_type": 1 00:15:41.157 }, 00:15:41.157 { 00:15:41.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.157 "dma_device_type": 2 00:15:41.157 } 00:15:41.157 ], 00:15:41.157 "driver_specific": {} 00:15:41.157 } 00:15:41.157 ] 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:41.157 10:23:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.157 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:41.414 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.414 "name": "Existed_Raid", 00:15:41.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:41.414 "strip_size_kb": 64, 00:15:41.414 "state": "configuring", 00:15:41.414 "raid_level": "concat", 00:15:41.414 "superblock": false, 00:15:41.414 "num_base_bdevs": 3, 00:15:41.414 "num_base_bdevs_discovered": 2, 00:15:41.414 "num_base_bdevs_operational": 3, 00:15:41.414 "base_bdevs_list": [ 00:15:41.414 { 
00:15:41.414 "name": "BaseBdev1", 00:15:41.414 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:41.414 "is_configured": true, 00:15:41.414 "data_offset": 0, 00:15:41.414 "data_size": 65536 00:15:41.414 }, 00:15:41.414 { 00:15:41.414 "name": null, 00:15:41.414 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:41.414 "is_configured": false, 00:15:41.414 "data_offset": 0, 00:15:41.414 "data_size": 65536 00:15:41.414 }, 00:15:41.414 { 00:15:41.414 "name": "BaseBdev3", 00:15:41.414 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:41.414 "is_configured": true, 00:15:41.414 "data_offset": 0, 00:15:41.414 "data_size": 65536 00:15:41.414 } 00:15:41.414 ] 00:15:41.414 }' 00:15:41.414 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.414 10:23:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.978 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.978 10:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:41.978 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:41.978 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:42.235 [2024-07-15 10:23:19.383800] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.235 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:42.493 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.493 "name": "Existed_Raid", 00:15:42.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:42.493 "strip_size_kb": 64, 00:15:42.493 "state": "configuring", 00:15:42.493 "raid_level": "concat", 00:15:42.493 "superblock": false, 00:15:42.493 "num_base_bdevs": 3, 00:15:42.493 "num_base_bdevs_discovered": 1, 00:15:42.493 "num_base_bdevs_operational": 3, 00:15:42.493 "base_bdevs_list": [ 00:15:42.493 { 00:15:42.493 "name": "BaseBdev1", 00:15:42.493 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:42.493 "is_configured": true, 00:15:42.493 "data_offset": 0, 00:15:42.493 "data_size": 65536 00:15:42.493 }, 00:15:42.493 { 00:15:42.493 "name": null, 00:15:42.493 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:42.493 
"is_configured": false, 00:15:42.493 "data_offset": 0, 00:15:42.493 "data_size": 65536 00:15:42.493 }, 00:15:42.493 { 00:15:42.493 "name": null, 00:15:42.493 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:42.493 "is_configured": false, 00:15:42.493 "data_offset": 0, 00:15:42.493 "data_size": 65536 00:15:42.493 } 00:15:42.493 ] 00:15:42.493 }' 00:15:42.493 10:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.493 10:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.426 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.426 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:43.426 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:43.426 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:43.684 [2024-07-15 10:23:20.723521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:43.684 10:23:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:43.684 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:43.941 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:43.941 "name": "Existed_Raid", 00:15:43.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:43.941 "strip_size_kb": 64, 00:15:43.941 "state": "configuring", 00:15:43.941 "raid_level": "concat", 00:15:43.941 "superblock": false, 00:15:43.941 "num_base_bdevs": 3, 00:15:43.941 "num_base_bdevs_discovered": 2, 00:15:43.941 "num_base_bdevs_operational": 3, 00:15:43.941 "base_bdevs_list": [ 00:15:43.941 { 00:15:43.941 "name": "BaseBdev1", 00:15:43.941 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:43.941 "is_configured": true, 00:15:43.941 "data_offset": 0, 00:15:43.941 "data_size": 65536 00:15:43.941 }, 00:15:43.941 { 00:15:43.941 "name": null, 00:15:43.941 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:43.941 "is_configured": false, 00:15:43.941 "data_offset": 0, 00:15:43.941 "data_size": 65536 00:15:43.941 }, 00:15:43.941 { 00:15:43.941 "name": "BaseBdev3", 00:15:43.941 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:43.941 "is_configured": true, 00:15:43.941 "data_offset": 0, 
00:15:43.941 "data_size": 65536 00:15:43.941 } 00:15:43.941 ] 00:15:43.941 }' 00:15:43.941 10:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:43.941 10:23:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:44.505 10:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.505 10:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:44.762 10:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:44.762 10:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:45.020 [2024-07-15 10:23:22.063120] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.020 
10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.020 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.021 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.021 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.279 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.279 "name": "Existed_Raid", 00:15:45.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.279 "strip_size_kb": 64, 00:15:45.279 "state": "configuring", 00:15:45.279 "raid_level": "concat", 00:15:45.279 "superblock": false, 00:15:45.279 "num_base_bdevs": 3, 00:15:45.279 "num_base_bdevs_discovered": 1, 00:15:45.279 "num_base_bdevs_operational": 3, 00:15:45.279 "base_bdevs_list": [ 00:15:45.279 { 00:15:45.279 "name": null, 00:15:45.279 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:45.279 "is_configured": false, 00:15:45.279 "data_offset": 0, 00:15:45.279 "data_size": 65536 00:15:45.279 }, 00:15:45.279 { 00:15:45.279 "name": null, 00:15:45.279 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:45.279 "is_configured": false, 00:15:45.279 "data_offset": 0, 00:15:45.279 "data_size": 65536 00:15:45.279 }, 00:15:45.279 { 00:15:45.279 "name": "BaseBdev3", 00:15:45.279 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:45.279 "is_configured": true, 00:15:45.279 "data_offset": 0, 00:15:45.279 "data_size": 65536 00:15:45.279 } 00:15:45.279 ] 00:15:45.279 }' 00:15:45.279 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.279 10:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.843 10:23:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.843 10:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:46.102 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:46.102 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:46.366 [2024-07-15 10:23:23.407203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.366 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:46.634 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:46.634 "name": "Existed_Raid", 00:15:46.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:46.634 "strip_size_kb": 64, 00:15:46.634 "state": "configuring", 00:15:46.634 "raid_level": "concat", 00:15:46.634 "superblock": false, 00:15:46.634 "num_base_bdevs": 3, 00:15:46.634 "num_base_bdevs_discovered": 2, 00:15:46.634 "num_base_bdevs_operational": 3, 00:15:46.634 "base_bdevs_list": [ 00:15:46.634 { 00:15:46.634 "name": null, 00:15:46.634 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:46.634 "is_configured": false, 00:15:46.634 "data_offset": 0, 00:15:46.634 "data_size": 65536 00:15:46.634 }, 00:15:46.634 { 00:15:46.634 "name": "BaseBdev2", 00:15:46.634 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:46.634 "is_configured": true, 00:15:46.634 "data_offset": 0, 00:15:46.634 "data_size": 65536 00:15:46.634 }, 00:15:46.634 { 00:15:46.634 "name": "BaseBdev3", 00:15:46.634 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:46.634 "is_configured": true, 00:15:46.634 "data_offset": 0, 00:15:46.634 "data_size": 65536 00:15:46.634 } 00:15:46.634 ] 00:15:46.634 }' 00:15:46.634 10:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:46.634 10:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.197 10:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.197 10:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:47.454 
10:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:47.454 10:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.454 10:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:47.712 10:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cf1b99e3-aac9-4787-83bd-bc56b50b62ba 00:15:47.970 [2024-07-15 10:23:24.996010] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:47.970 [2024-07-15 10:23:24.996053] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d04450 00:15:47.970 [2024-07-15 10:23:24.996062] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:47.970 [2024-07-15 10:23:24.996258] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d05ed0 00:15:47.970 [2024-07-15 10:23:24.996376] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d04450 00:15:47.970 [2024-07-15 10:23:24.996386] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d04450 00:15:47.970 [2024-07-15 10:23:24.996557] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:47.970 NewBaseBdev 00:15:47.970 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:47.970 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:47.970 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:47.970 10:23:25 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:15:47.970 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:47.970 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:47.970 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:48.228 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:48.485 [ 00:15:48.485 { 00:15:48.485 "name": "NewBaseBdev", 00:15:48.485 "aliases": [ 00:15:48.485 "cf1b99e3-aac9-4787-83bd-bc56b50b62ba" 00:15:48.485 ], 00:15:48.485 "product_name": "Malloc disk", 00:15:48.485 "block_size": 512, 00:15:48.485 "num_blocks": 65536, 00:15:48.485 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:48.485 "assigned_rate_limits": { 00:15:48.485 "rw_ios_per_sec": 0, 00:15:48.485 "rw_mbytes_per_sec": 0, 00:15:48.485 "r_mbytes_per_sec": 0, 00:15:48.485 "w_mbytes_per_sec": 0 00:15:48.485 }, 00:15:48.485 "claimed": true, 00:15:48.485 "claim_type": "exclusive_write", 00:15:48.486 "zoned": false, 00:15:48.486 "supported_io_types": { 00:15:48.486 "read": true, 00:15:48.486 "write": true, 00:15:48.486 "unmap": true, 00:15:48.486 "flush": true, 00:15:48.486 "reset": true, 00:15:48.486 "nvme_admin": false, 00:15:48.486 "nvme_io": false, 00:15:48.486 "nvme_io_md": false, 00:15:48.486 "write_zeroes": true, 00:15:48.486 "zcopy": true, 00:15:48.486 "get_zone_info": false, 00:15:48.486 "zone_management": false, 00:15:48.486 "zone_append": false, 00:15:48.486 "compare": false, 00:15:48.486 "compare_and_write": false, 00:15:48.486 "abort": true, 00:15:48.486 "seek_hole": false, 00:15:48.486 "seek_data": false, 00:15:48.486 "copy": true, 00:15:48.486 "nvme_iov_md": 
false 00:15:48.486 }, 00:15:48.486 "memory_domains": [ 00:15:48.486 { 00:15:48.486 "dma_device_id": "system", 00:15:48.486 "dma_device_type": 1 00:15:48.486 }, 00:15:48.486 { 00:15:48.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.486 "dma_device_type": 2 00:15:48.486 } 00:15:48.486 ], 00:15:48.486 "driver_specific": {} 00:15:48.486 } 00:15:48.486 ] 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.486 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.744 10:23:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.744 "name": "Existed_Raid", 00:15:48.744 "uuid": "dfb3297f-8f45-46c2-badd-8c4500201a78", 00:15:48.744 "strip_size_kb": 64, 00:15:48.744 "state": "online", 00:15:48.744 "raid_level": "concat", 00:15:48.744 "superblock": false, 00:15:48.744 "num_base_bdevs": 3, 00:15:48.744 "num_base_bdevs_discovered": 3, 00:15:48.744 "num_base_bdevs_operational": 3, 00:15:48.744 "base_bdevs_list": [ 00:15:48.744 { 00:15:48.744 "name": "NewBaseBdev", 00:15:48.744 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:48.744 "is_configured": true, 00:15:48.744 "data_offset": 0, 00:15:48.744 "data_size": 65536 00:15:48.744 }, 00:15:48.744 { 00:15:48.744 "name": "BaseBdev2", 00:15:48.744 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:48.744 "is_configured": true, 00:15:48.744 "data_offset": 0, 00:15:48.744 "data_size": 65536 00:15:48.744 }, 00:15:48.744 { 00:15:48.744 "name": "BaseBdev3", 00:15:48.744 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:48.744 "is_configured": true, 00:15:48.744 "data_offset": 0, 00:15:48.744 "data_size": 65536 00:15:48.744 } 00:15:48.744 ] 00:15:48.744 }' 00:15:48.744 10:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.744 10:23:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.310 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:49.310 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:49.310 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:49.310 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:49.310 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:49.310 10:23:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:49.310 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:49.310 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:49.568 [2024-07-15 10:23:26.564497] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:49.568 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:49.568 "name": "Existed_Raid", 00:15:49.568 "aliases": [ 00:15:49.569 "dfb3297f-8f45-46c2-badd-8c4500201a78" 00:15:49.569 ], 00:15:49.569 "product_name": "Raid Volume", 00:15:49.569 "block_size": 512, 00:15:49.569 "num_blocks": 196608, 00:15:49.569 "uuid": "dfb3297f-8f45-46c2-badd-8c4500201a78", 00:15:49.569 "assigned_rate_limits": { 00:15:49.569 "rw_ios_per_sec": 0, 00:15:49.569 "rw_mbytes_per_sec": 0, 00:15:49.569 "r_mbytes_per_sec": 0, 00:15:49.569 "w_mbytes_per_sec": 0 00:15:49.569 }, 00:15:49.569 "claimed": false, 00:15:49.569 "zoned": false, 00:15:49.569 "supported_io_types": { 00:15:49.569 "read": true, 00:15:49.569 "write": true, 00:15:49.569 "unmap": true, 00:15:49.569 "flush": true, 00:15:49.569 "reset": true, 00:15:49.569 "nvme_admin": false, 00:15:49.569 "nvme_io": false, 00:15:49.569 "nvme_io_md": false, 00:15:49.569 "write_zeroes": true, 00:15:49.569 "zcopy": false, 00:15:49.569 "get_zone_info": false, 00:15:49.569 "zone_management": false, 00:15:49.569 "zone_append": false, 00:15:49.569 "compare": false, 00:15:49.569 "compare_and_write": false, 00:15:49.569 "abort": false, 00:15:49.569 "seek_hole": false, 00:15:49.569 "seek_data": false, 00:15:49.569 "copy": false, 00:15:49.569 "nvme_iov_md": false 00:15:49.569 }, 00:15:49.569 "memory_domains": [ 00:15:49.569 { 00:15:49.569 "dma_device_id": "system", 00:15:49.569 "dma_device_type": 1 00:15:49.569 }, 
00:15:49.569 { 00:15:49.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.569 "dma_device_type": 2 00:15:49.569 }, 00:15:49.569 { 00:15:49.569 "dma_device_id": "system", 00:15:49.569 "dma_device_type": 1 00:15:49.569 }, 00:15:49.569 { 00:15:49.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.569 "dma_device_type": 2 00:15:49.569 }, 00:15:49.569 { 00:15:49.569 "dma_device_id": "system", 00:15:49.569 "dma_device_type": 1 00:15:49.569 }, 00:15:49.569 { 00:15:49.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.569 "dma_device_type": 2 00:15:49.569 } 00:15:49.569 ], 00:15:49.569 "driver_specific": { 00:15:49.569 "raid": { 00:15:49.569 "uuid": "dfb3297f-8f45-46c2-badd-8c4500201a78", 00:15:49.569 "strip_size_kb": 64, 00:15:49.569 "state": "online", 00:15:49.569 "raid_level": "concat", 00:15:49.569 "superblock": false, 00:15:49.569 "num_base_bdevs": 3, 00:15:49.569 "num_base_bdevs_discovered": 3, 00:15:49.569 "num_base_bdevs_operational": 3, 00:15:49.569 "base_bdevs_list": [ 00:15:49.569 { 00:15:49.569 "name": "NewBaseBdev", 00:15:49.569 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:49.569 "is_configured": true, 00:15:49.569 "data_offset": 0, 00:15:49.569 "data_size": 65536 00:15:49.569 }, 00:15:49.569 { 00:15:49.569 "name": "BaseBdev2", 00:15:49.569 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:49.569 "is_configured": true, 00:15:49.569 "data_offset": 0, 00:15:49.569 "data_size": 65536 00:15:49.569 }, 00:15:49.569 { 00:15:49.569 "name": "BaseBdev3", 00:15:49.569 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:49.569 "is_configured": true, 00:15:49.569 "data_offset": 0, 00:15:49.569 "data_size": 65536 00:15:49.569 } 00:15:49.569 ] 00:15:49.569 } 00:15:49.569 } 00:15:49.569 }' 00:15:49.569 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:49.569 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:49.569 BaseBdev2 00:15:49.569 BaseBdev3' 00:15:49.569 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:49.569 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:49.569 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:49.827 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:49.827 "name": "NewBaseBdev", 00:15:49.827 "aliases": [ 00:15:49.827 "cf1b99e3-aac9-4787-83bd-bc56b50b62ba" 00:15:49.827 ], 00:15:49.827 "product_name": "Malloc disk", 00:15:49.827 "block_size": 512, 00:15:49.827 "num_blocks": 65536, 00:15:49.827 "uuid": "cf1b99e3-aac9-4787-83bd-bc56b50b62ba", 00:15:49.827 "assigned_rate_limits": { 00:15:49.827 "rw_ios_per_sec": 0, 00:15:49.827 "rw_mbytes_per_sec": 0, 00:15:49.827 "r_mbytes_per_sec": 0, 00:15:49.827 "w_mbytes_per_sec": 0 00:15:49.827 }, 00:15:49.827 "claimed": true, 00:15:49.827 "claim_type": "exclusive_write", 00:15:49.827 "zoned": false, 00:15:49.827 "supported_io_types": { 00:15:49.827 "read": true, 00:15:49.827 "write": true, 00:15:49.827 "unmap": true, 00:15:49.827 "flush": true, 00:15:49.827 "reset": true, 00:15:49.827 "nvme_admin": false, 00:15:49.827 "nvme_io": false, 00:15:49.827 "nvme_io_md": false, 00:15:49.827 "write_zeroes": true, 00:15:49.827 "zcopy": true, 00:15:49.827 "get_zone_info": false, 00:15:49.827 "zone_management": false, 00:15:49.827 "zone_append": false, 00:15:49.827 "compare": false, 00:15:49.827 "compare_and_write": false, 00:15:49.827 "abort": true, 00:15:49.827 "seek_hole": false, 00:15:49.827 "seek_data": false, 00:15:49.827 "copy": true, 00:15:49.827 "nvme_iov_md": false 00:15:49.827 }, 00:15:49.827 "memory_domains": [ 00:15:49.827 { 00:15:49.827 "dma_device_id": "system", 00:15:49.827 
"dma_device_type": 1 00:15:49.827 }, 00:15:49.827 { 00:15:49.827 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.827 "dma_device_type": 2 00:15:49.827 } 00:15:49.827 ], 00:15:49.827 "driver_specific": {} 00:15:49.827 }' 00:15:49.827 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:49.827 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:49.827 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:49.827 10:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:49.827 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.085 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:50.086 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:50.344 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:50.344 "name": 
"BaseBdev2", 00:15:50.344 "aliases": [ 00:15:50.344 "23083494-d614-4837-af20-744b061bdcb9" 00:15:50.344 ], 00:15:50.344 "product_name": "Malloc disk", 00:15:50.344 "block_size": 512, 00:15:50.344 "num_blocks": 65536, 00:15:50.344 "uuid": "23083494-d614-4837-af20-744b061bdcb9", 00:15:50.344 "assigned_rate_limits": { 00:15:50.344 "rw_ios_per_sec": 0, 00:15:50.344 "rw_mbytes_per_sec": 0, 00:15:50.344 "r_mbytes_per_sec": 0, 00:15:50.344 "w_mbytes_per_sec": 0 00:15:50.344 }, 00:15:50.344 "claimed": true, 00:15:50.344 "claim_type": "exclusive_write", 00:15:50.344 "zoned": false, 00:15:50.344 "supported_io_types": { 00:15:50.344 "read": true, 00:15:50.344 "write": true, 00:15:50.344 "unmap": true, 00:15:50.344 "flush": true, 00:15:50.344 "reset": true, 00:15:50.344 "nvme_admin": false, 00:15:50.344 "nvme_io": false, 00:15:50.344 "nvme_io_md": false, 00:15:50.344 "write_zeroes": true, 00:15:50.344 "zcopy": true, 00:15:50.344 "get_zone_info": false, 00:15:50.344 "zone_management": false, 00:15:50.344 "zone_append": false, 00:15:50.344 "compare": false, 00:15:50.344 "compare_and_write": false, 00:15:50.344 "abort": true, 00:15:50.344 "seek_hole": false, 00:15:50.344 "seek_data": false, 00:15:50.344 "copy": true, 00:15:50.344 "nvme_iov_md": false 00:15:50.344 }, 00:15:50.344 "memory_domains": [ 00:15:50.344 { 00:15:50.344 "dma_device_id": "system", 00:15:50.344 "dma_device_type": 1 00:15:50.344 }, 00:15:50.344 { 00:15:50.344 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.344 "dma_device_type": 2 00:15:50.344 } 00:15:50.344 ], 00:15:50.344 "driver_specific": {} 00:15:50.344 }' 00:15:50.344 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.344 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.601 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.858 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:50.858 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:50.858 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:50.858 10:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.113 "name": "BaseBdev3", 00:15:51.113 "aliases": [ 00:15:51.113 "6d7e15ed-aa27-43fc-b95f-747660e053ec" 00:15:51.113 ], 00:15:51.113 "product_name": "Malloc disk", 00:15:51.113 "block_size": 512, 00:15:51.113 "num_blocks": 65536, 00:15:51.113 "uuid": "6d7e15ed-aa27-43fc-b95f-747660e053ec", 00:15:51.113 "assigned_rate_limits": { 00:15:51.113 "rw_ios_per_sec": 0, 00:15:51.113 "rw_mbytes_per_sec": 0, 00:15:51.113 "r_mbytes_per_sec": 0, 00:15:51.113 "w_mbytes_per_sec": 0 00:15:51.113 }, 00:15:51.113 "claimed": true, 00:15:51.113 "claim_type": "exclusive_write", 00:15:51.113 "zoned": false, 00:15:51.113 "supported_io_types": { 
00:15:51.113 "read": true, 00:15:51.113 "write": true, 00:15:51.113 "unmap": true, 00:15:51.113 "flush": true, 00:15:51.113 "reset": true, 00:15:51.113 "nvme_admin": false, 00:15:51.113 "nvme_io": false, 00:15:51.113 "nvme_io_md": false, 00:15:51.113 "write_zeroes": true, 00:15:51.113 "zcopy": true, 00:15:51.113 "get_zone_info": false, 00:15:51.113 "zone_management": false, 00:15:51.113 "zone_append": false, 00:15:51.113 "compare": false, 00:15:51.113 "compare_and_write": false, 00:15:51.113 "abort": true, 00:15:51.113 "seek_hole": false, 00:15:51.113 "seek_data": false, 00:15:51.113 "copy": true, 00:15:51.113 "nvme_iov_md": false 00:15:51.113 }, 00:15:51.113 "memory_domains": [ 00:15:51.113 { 00:15:51.113 "dma_device_id": "system", 00:15:51.113 "dma_device_type": 1 00:15:51.113 }, 00:15:51.113 { 00:15:51.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.113 "dma_device_type": 2 00:15:51.113 } 00:15:51.113 ], 00:15:51.113 "driver_specific": {} 00:15:51.113 }' 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.113 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.382 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.383 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:51.383 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.383 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.383 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:51.641 [2024-07-15 10:23:28.673821] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:51.641 [2024-07-15 10:23:28.673851] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:51.641 [2024-07-15 10:23:28.673910] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:51.641 [2024-07-15 10:23:28.673975] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:51.641 [2024-07-15 10:23:28.673988] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d04450 name Existed_Raid, state offline 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 507894 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 507894 ']' 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 507894 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 507894 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 507894' 00:15:51.641 killing process with pid 507894 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 507894 00:15:51.641 [2024-07-15 10:23:28.740048] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:51.641 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 507894 00:15:51.641 [2024-07-15 10:23:28.771086] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:51.898 10:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:51.898 00:15:51.898 real 0m28.331s 00:15:51.898 user 0m51.896s 00:15:51.898 sys 0m5.120s 00:15:51.898 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:51.898 10:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.898 ************************************ 00:15:51.898 END TEST raid_state_function_test 00:15:51.898 ************************************ 00:15:51.898 10:23:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:51.898 10:23:29 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:51.898 10:23:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:51.898 10:23:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:51.898 10:23:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:51.898 ************************************ 00:15:51.898 START TEST raid_state_function_test_sb 00:15:51.898 ************************************ 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=512196 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 512196' 00:15:51.898 Process raid pid: 512196 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 512196 /var/tmp/spdk-raid.sock 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 512196 ']' 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:15:51.898 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:51.898 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:52.155 [2024-07-15 10:23:29.123299] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:15:52.155 [2024-07-15 10:23:29.123344] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:52.155 [2024-07-15 10:23:29.235032] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:52.155 [2024-07-15 10:23:29.339677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:52.412 [2024-07-15 10:23:29.399403] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.412 [2024-07-15 10:23:29.399432] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.412 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:52.412 10:23:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:52.412 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:52.668 [2024-07-15 10:23:29.832851] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:52.669 [2024-07-15 10:23:29.832894] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:52.669 [2024-07-15 10:23:29.832906] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:52.669 [2024-07-15 10:23:29.832917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:52.669 [2024-07-15 10:23:29.832934] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:52.669 [2024-07-15 10:23:29.832945] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.669 10:23:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:52.925 10:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.925 "name": "Existed_Raid", 00:15:52.925 "uuid": "d9a20531-4de3-47fb-9473-4b1c69e0c4ee", 00:15:52.925 "strip_size_kb": 64, 00:15:52.925 "state": "configuring", 00:15:52.925 "raid_level": "concat", 00:15:52.925 "superblock": true, 00:15:52.925 "num_base_bdevs": 3, 00:15:52.925 "num_base_bdevs_discovered": 0, 00:15:52.925 "num_base_bdevs_operational": 3, 00:15:52.925 "base_bdevs_list": [ 00:15:52.925 { 00:15:52.925 "name": "BaseBdev1", 00:15:52.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.925 "is_configured": false, 00:15:52.925 "data_offset": 0, 00:15:52.925 "data_size": 0 00:15:52.925 }, 00:15:52.925 { 00:15:52.925 "name": "BaseBdev2", 00:15:52.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.925 "is_configured": false, 00:15:52.925 "data_offset": 0, 00:15:52.925 "data_size": 0 00:15:52.925 }, 00:15:52.925 { 00:15:52.925 "name": "BaseBdev3", 00:15:52.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.925 "is_configured": false, 00:15:52.925 "data_offset": 0, 00:15:52.925 "data_size": 0 00:15:52.925 } 00:15:52.925 ] 00:15:52.925 }' 00:15:52.925 10:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.925 10:23:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:53.488 10:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:53.745 [2024-07-15 10:23:30.907547] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:53.745 [2024-07-15 10:23:30.907581] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x810a80 name Existed_Raid, state configuring 00:15:53.745 10:23:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:54.002 [2024-07-15 10:23:31.152229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:54.002 [2024-07-15 10:23:31.152263] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:54.002 [2024-07-15 10:23:31.152274] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:54.002 [2024-07-15 10:23:31.152285] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:54.002 [2024-07-15 10:23:31.152295] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:54.002 [2024-07-15 10:23:31.152306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:54.002 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:54.259 [2024-07-15 10:23:31.407971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:54.259 BaseBdev1 00:15:54.259 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:54.259 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:54.259 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:54.259 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:54.259 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:54.260 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:54.260 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:54.516 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:54.774 [ 00:15:54.774 { 00:15:54.774 "name": "BaseBdev1", 00:15:54.774 "aliases": [ 00:15:54.774 "0a41c11b-9a90-4646-9f84-a716999b059e" 00:15:54.774 ], 00:15:54.774 "product_name": "Malloc disk", 00:15:54.774 "block_size": 512, 00:15:54.774 "num_blocks": 65536, 00:15:54.774 "uuid": "0a41c11b-9a90-4646-9f84-a716999b059e", 00:15:54.774 "assigned_rate_limits": { 00:15:54.774 "rw_ios_per_sec": 0, 00:15:54.774 "rw_mbytes_per_sec": 0, 00:15:54.774 "r_mbytes_per_sec": 0, 00:15:54.774 "w_mbytes_per_sec": 0 00:15:54.774 }, 00:15:54.774 "claimed": true, 00:15:54.774 "claim_type": "exclusive_write", 00:15:54.774 "zoned": false, 00:15:54.774 "supported_io_types": { 00:15:54.774 "read": true, 00:15:54.774 "write": true, 00:15:54.774 "unmap": true, 00:15:54.774 "flush": true, 00:15:54.774 "reset": true, 00:15:54.774 "nvme_admin": false, 00:15:54.774 "nvme_io": false, 00:15:54.774 "nvme_io_md": false, 00:15:54.774 "write_zeroes": true, 00:15:54.774 "zcopy": true, 00:15:54.774 "get_zone_info": false, 00:15:54.774 "zone_management": false, 00:15:54.774 "zone_append": false, 00:15:54.774 "compare": false, 00:15:54.774 "compare_and_write": false, 00:15:54.774 "abort": true, 00:15:54.774 "seek_hole": false, 00:15:54.774 "seek_data": false, 00:15:54.774 "copy": true, 00:15:54.774 "nvme_iov_md": false 00:15:54.774 }, 00:15:54.774 "memory_domains": [ 00:15:54.774 { 00:15:54.774 "dma_device_id": "system", 00:15:54.774 "dma_device_type": 1 00:15:54.774 }, 00:15:54.774 { 00:15:54.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.774 
"dma_device_type": 2 00:15:54.774 } 00:15:54.774 ], 00:15:54.774 "driver_specific": {} 00:15:54.774 } 00:15:54.774 ] 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.774 10:23:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:55.032 10:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.032 "name": "Existed_Raid", 00:15:55.032 "uuid": "32da0be0-40f4-4d7d-9ee1-24389a6db33c", 00:15:55.032 "strip_size_kb": 64, 
00:15:55.032 "state": "configuring", 00:15:55.032 "raid_level": "concat", 00:15:55.032 "superblock": true, 00:15:55.032 "num_base_bdevs": 3, 00:15:55.032 "num_base_bdevs_discovered": 1, 00:15:55.032 "num_base_bdevs_operational": 3, 00:15:55.032 "base_bdevs_list": [ 00:15:55.032 { 00:15:55.032 "name": "BaseBdev1", 00:15:55.032 "uuid": "0a41c11b-9a90-4646-9f84-a716999b059e", 00:15:55.032 "is_configured": true, 00:15:55.032 "data_offset": 2048, 00:15:55.032 "data_size": 63488 00:15:55.032 }, 00:15:55.032 { 00:15:55.032 "name": "BaseBdev2", 00:15:55.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.032 "is_configured": false, 00:15:55.032 "data_offset": 0, 00:15:55.032 "data_size": 0 00:15:55.032 }, 00:15:55.032 { 00:15:55.032 "name": "BaseBdev3", 00:15:55.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.032 "is_configured": false, 00:15:55.032 "data_offset": 0, 00:15:55.032 "data_size": 0 00:15:55.032 } 00:15:55.032 ] 00:15:55.032 }' 00:15:55.032 10:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.032 10:23:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:55.598 10:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:55.857 [2024-07-15 10:23:32.972129] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:55.857 [2024-07-15 10:23:32.972171] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x810310 name Existed_Raid, state configuring 00:15:55.857 10:23:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:56.115 [2024-07-15 10:23:33.212814] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:56.115 [2024-07-15 10:23:33.214332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:56.115 [2024-07-15 10:23:33.214364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:56.115 [2024-07-15 10:23:33.214374] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:56.115 [2024-07-15 10:23:33.214386] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.115 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:56.373 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.373 "name": "Existed_Raid", 00:15:56.373 "uuid": "158c0b3d-b06a-4d81-b45d-f6bae473d483", 00:15:56.373 "strip_size_kb": 64, 00:15:56.373 "state": "configuring", 00:15:56.373 "raid_level": "concat", 00:15:56.373 "superblock": true, 00:15:56.373 "num_base_bdevs": 3, 00:15:56.373 "num_base_bdevs_discovered": 1, 00:15:56.373 "num_base_bdevs_operational": 3, 00:15:56.373 "base_bdevs_list": [ 00:15:56.373 { 00:15:56.373 "name": "BaseBdev1", 00:15:56.373 "uuid": "0a41c11b-9a90-4646-9f84-a716999b059e", 00:15:56.373 "is_configured": true, 00:15:56.373 "data_offset": 2048, 00:15:56.373 "data_size": 63488 00:15:56.373 }, 00:15:56.373 { 00:15:56.373 "name": "BaseBdev2", 00:15:56.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:56.373 "is_configured": false, 00:15:56.373 "data_offset": 0, 00:15:56.373 "data_size": 0 00:15:56.373 }, 00:15:56.373 { 00:15:56.373 "name": "BaseBdev3", 00:15:56.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:56.373 "is_configured": false, 00:15:56.373 "data_offset": 0, 00:15:56.373 "data_size": 0 00:15:56.373 } 00:15:56.373 ] 00:15:56.373 }' 00:15:56.373 10:23:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.373 10:23:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.937 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:15:57.195 [2024-07-15 10:23:34.230904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:57.195 BaseBdev2 00:15:57.195 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:57.195 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:57.195 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:57.195 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:57.195 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:57.195 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:57.195 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:57.453 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:57.712 [ 00:15:57.712 { 00:15:57.712 "name": "BaseBdev2", 00:15:57.712 "aliases": [ 00:15:57.712 "8678d897-bec2-4520-b541-e5cbb3af7513" 00:15:57.712 ], 00:15:57.712 "product_name": "Malloc disk", 00:15:57.712 "block_size": 512, 00:15:57.712 "num_blocks": 65536, 00:15:57.712 "uuid": "8678d897-bec2-4520-b541-e5cbb3af7513", 00:15:57.712 "assigned_rate_limits": { 00:15:57.712 "rw_ios_per_sec": 0, 00:15:57.712 "rw_mbytes_per_sec": 0, 00:15:57.712 "r_mbytes_per_sec": 0, 00:15:57.712 "w_mbytes_per_sec": 0 00:15:57.712 }, 00:15:57.712 "claimed": true, 00:15:57.712 "claim_type": "exclusive_write", 00:15:57.712 "zoned": false, 00:15:57.712 "supported_io_types": { 00:15:57.712 "read": true, 00:15:57.712 "write": true, 
00:15:57.712 "unmap": true, 00:15:57.712 "flush": true, 00:15:57.712 "reset": true, 00:15:57.712 "nvme_admin": false, 00:15:57.712 "nvme_io": false, 00:15:57.712 "nvme_io_md": false, 00:15:57.712 "write_zeroes": true, 00:15:57.712 "zcopy": true, 00:15:57.712 "get_zone_info": false, 00:15:57.712 "zone_management": false, 00:15:57.712 "zone_append": false, 00:15:57.712 "compare": false, 00:15:57.712 "compare_and_write": false, 00:15:57.712 "abort": true, 00:15:57.712 "seek_hole": false, 00:15:57.712 "seek_data": false, 00:15:57.712 "copy": true, 00:15:57.712 "nvme_iov_md": false 00:15:57.712 }, 00:15:57.712 "memory_domains": [ 00:15:57.712 { 00:15:57.712 "dma_device_id": "system", 00:15:57.712 "dma_device_type": 1 00:15:57.712 }, 00:15:57.712 { 00:15:57.712 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.712 "dma_device_type": 2 00:15:57.712 } 00:15:57.712 ], 00:15:57.712 "driver_specific": {} 00:15:57.712 } 00:15:57.712 ] 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.712 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.971 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.971 "name": "Existed_Raid", 00:15:57.971 "uuid": "158c0b3d-b06a-4d81-b45d-f6bae473d483", 00:15:57.971 "strip_size_kb": 64, 00:15:57.971 "state": "configuring", 00:15:57.971 "raid_level": "concat", 00:15:57.971 "superblock": true, 00:15:57.971 "num_base_bdevs": 3, 00:15:57.971 "num_base_bdevs_discovered": 2, 00:15:57.971 "num_base_bdevs_operational": 3, 00:15:57.971 "base_bdevs_list": [ 00:15:57.971 { 00:15:57.971 "name": "BaseBdev1", 00:15:57.971 "uuid": "0a41c11b-9a90-4646-9f84-a716999b059e", 00:15:57.971 "is_configured": true, 00:15:57.971 "data_offset": 2048, 00:15:57.971 "data_size": 63488 00:15:57.971 }, 00:15:57.971 { 00:15:57.971 "name": "BaseBdev2", 00:15:57.971 "uuid": "8678d897-bec2-4520-b541-e5cbb3af7513", 00:15:57.971 "is_configured": true, 00:15:57.971 "data_offset": 2048, 00:15:57.971 "data_size": 63488 00:15:57.971 }, 00:15:57.971 { 00:15:57.971 "name": "BaseBdev3", 00:15:57.971 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.971 "is_configured": false, 00:15:57.971 "data_offset": 0, 00:15:57.971 "data_size": 0 00:15:57.971 } 
00:15:57.971 ] 00:15:57.971 }' 00:15:57.971 10:23:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.971 10:23:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:58.537 [2024-07-15 10:23:35.690210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:58.537 [2024-07-15 10:23:35.690368] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x811400 00:15:58.537 [2024-07-15 10:23:35.690382] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:58.537 [2024-07-15 10:23:35.690555] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x810ef0 00:15:58.537 [2024-07-15 10:23:35.690665] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x811400 00:15:58.537 [2024-07-15 10:23:35.690675] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x811400 00:15:58.537 [2024-07-15 10:23:35.690762] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:58.537 BaseBdev3 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:58.537 10:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:58.796 10:23:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:59.054 [ 00:15:59.054 { 00:15:59.054 "name": "BaseBdev3", 00:15:59.054 "aliases": [ 00:15:59.054 "17219f4c-6d43-4165-b361-13ac6d2608cd" 00:15:59.054 ], 00:15:59.054 "product_name": "Malloc disk", 00:15:59.054 "block_size": 512, 00:15:59.054 "num_blocks": 65536, 00:15:59.054 "uuid": "17219f4c-6d43-4165-b361-13ac6d2608cd", 00:15:59.054 "assigned_rate_limits": { 00:15:59.054 "rw_ios_per_sec": 0, 00:15:59.054 "rw_mbytes_per_sec": 0, 00:15:59.054 "r_mbytes_per_sec": 0, 00:15:59.054 "w_mbytes_per_sec": 0 00:15:59.054 }, 00:15:59.054 "claimed": true, 00:15:59.054 "claim_type": "exclusive_write", 00:15:59.054 "zoned": false, 00:15:59.054 "supported_io_types": { 00:15:59.054 "read": true, 00:15:59.054 "write": true, 00:15:59.054 "unmap": true, 00:15:59.054 "flush": true, 00:15:59.054 "reset": true, 00:15:59.054 "nvme_admin": false, 00:15:59.054 "nvme_io": false, 00:15:59.054 "nvme_io_md": false, 00:15:59.054 "write_zeroes": true, 00:15:59.054 "zcopy": true, 00:15:59.054 "get_zone_info": false, 00:15:59.054 "zone_management": false, 00:15:59.054 "zone_append": false, 00:15:59.054 "compare": false, 00:15:59.054 "compare_and_write": false, 00:15:59.054 "abort": true, 00:15:59.054 "seek_hole": false, 00:15:59.054 "seek_data": false, 00:15:59.054 "copy": true, 00:15:59.054 "nvme_iov_md": false 00:15:59.054 }, 00:15:59.054 "memory_domains": [ 00:15:59.054 { 00:15:59.054 "dma_device_id": "system", 00:15:59.054 "dma_device_type": 1 00:15:59.054 }, 00:15:59.054 { 00:15:59.054 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:59.054 "dma_device_type": 2 00:15:59.054 } 00:15:59.054 ], 00:15:59.054 "driver_specific": {} 00:15:59.054 } 00:15:59.054 ] 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.054 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:59.313 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.313 "name": "Existed_Raid", 00:15:59.313 "uuid": "158c0b3d-b06a-4d81-b45d-f6bae473d483", 00:15:59.313 "strip_size_kb": 64, 00:15:59.313 "state": "online", 00:15:59.313 "raid_level": "concat", 00:15:59.313 "superblock": true, 00:15:59.313 "num_base_bdevs": 3, 00:15:59.313 "num_base_bdevs_discovered": 3, 00:15:59.313 "num_base_bdevs_operational": 3, 00:15:59.313 "base_bdevs_list": [ 00:15:59.313 { 00:15:59.313 "name": "BaseBdev1", 00:15:59.313 "uuid": "0a41c11b-9a90-4646-9f84-a716999b059e", 00:15:59.313 "is_configured": true, 00:15:59.313 "data_offset": 2048, 00:15:59.313 "data_size": 63488 00:15:59.313 }, 00:15:59.313 { 00:15:59.313 "name": "BaseBdev2", 00:15:59.313 "uuid": "8678d897-bec2-4520-b541-e5cbb3af7513", 00:15:59.313 "is_configured": true, 00:15:59.313 "data_offset": 2048, 00:15:59.313 "data_size": 63488 00:15:59.313 }, 00:15:59.313 { 00:15:59.313 "name": "BaseBdev3", 00:15:59.313 "uuid": "17219f4c-6d43-4165-b361-13ac6d2608cd", 00:15:59.313 "is_configured": true, 00:15:59.313 "data_offset": 2048, 00:15:59.313 "data_size": 63488 00:15:59.313 } 00:15:59.313 ] 00:15:59.313 }' 00:15:59.313 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.313 10:23:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:59.879 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:59.879 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:59.879 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:59.879 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:59.879 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:59.879 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:59.879 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:59.880 10:23:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:59.880 [2024-07-15 10:23:37.050142] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:59.880 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:59.880 "name": "Existed_Raid", 00:15:59.880 "aliases": [ 00:15:59.880 "158c0b3d-b06a-4d81-b45d-f6bae473d483" 00:15:59.880 ], 00:15:59.880 "product_name": "Raid Volume", 00:15:59.880 "block_size": 512, 00:15:59.880 "num_blocks": 190464, 00:15:59.880 "uuid": "158c0b3d-b06a-4d81-b45d-f6bae473d483", 00:15:59.880 "assigned_rate_limits": { 00:15:59.880 "rw_ios_per_sec": 0, 00:15:59.880 "rw_mbytes_per_sec": 0, 00:15:59.880 "r_mbytes_per_sec": 0, 00:15:59.880 "w_mbytes_per_sec": 0 00:15:59.880 }, 00:15:59.880 "claimed": false, 00:15:59.880 "zoned": false, 00:15:59.880 "supported_io_types": { 00:15:59.880 "read": true, 00:15:59.880 "write": true, 00:15:59.880 "unmap": true, 00:15:59.880 "flush": true, 00:15:59.880 "reset": true, 00:15:59.880 "nvme_admin": false, 00:15:59.880 "nvme_io": false, 00:15:59.880 "nvme_io_md": false, 00:15:59.880 "write_zeroes": true, 00:15:59.880 "zcopy": false, 00:15:59.880 "get_zone_info": false, 00:15:59.880 "zone_management": false, 00:15:59.880 "zone_append": false, 00:15:59.880 "compare": false, 00:15:59.880 "compare_and_write": false, 00:15:59.880 "abort": false, 00:15:59.880 "seek_hole": false, 00:15:59.880 "seek_data": false, 00:15:59.880 "copy": false, 00:15:59.880 "nvme_iov_md": false 00:15:59.880 }, 00:15:59.880 "memory_domains": [ 00:15:59.880 { 00:15:59.880 "dma_device_id": "system", 
00:15:59.880 "dma_device_type": 1 00:15:59.880 }, 00:15:59.880 { 00:15:59.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.880 "dma_device_type": 2 00:15:59.880 }, 00:15:59.880 { 00:15:59.880 "dma_device_id": "system", 00:15:59.880 "dma_device_type": 1 00:15:59.880 }, 00:15:59.880 { 00:15:59.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.880 "dma_device_type": 2 00:15:59.880 }, 00:15:59.880 { 00:15:59.880 "dma_device_id": "system", 00:15:59.880 "dma_device_type": 1 00:15:59.880 }, 00:15:59.880 { 00:15:59.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.880 "dma_device_type": 2 00:15:59.880 } 00:15:59.880 ], 00:15:59.880 "driver_specific": { 00:15:59.880 "raid": { 00:15:59.880 "uuid": "158c0b3d-b06a-4d81-b45d-f6bae473d483", 00:15:59.880 "strip_size_kb": 64, 00:15:59.880 "state": "online", 00:15:59.880 "raid_level": "concat", 00:15:59.880 "superblock": true, 00:15:59.880 "num_base_bdevs": 3, 00:15:59.880 "num_base_bdevs_discovered": 3, 00:15:59.880 "num_base_bdevs_operational": 3, 00:15:59.880 "base_bdevs_list": [ 00:15:59.880 { 00:15:59.880 "name": "BaseBdev1", 00:15:59.880 "uuid": "0a41c11b-9a90-4646-9f84-a716999b059e", 00:15:59.880 "is_configured": true, 00:15:59.880 "data_offset": 2048, 00:15:59.880 "data_size": 63488 00:15:59.880 }, 00:15:59.880 { 00:15:59.880 "name": "BaseBdev2", 00:15:59.880 "uuid": "8678d897-bec2-4520-b541-e5cbb3af7513", 00:15:59.880 "is_configured": true, 00:15:59.880 "data_offset": 2048, 00:15:59.880 "data_size": 63488 00:15:59.880 }, 00:15:59.880 { 00:15:59.880 "name": "BaseBdev3", 00:15:59.880 "uuid": "17219f4c-6d43-4165-b361-13ac6d2608cd", 00:15:59.880 "is_configured": true, 00:15:59.880 "data_offset": 2048, 00:15:59.880 "data_size": 63488 00:15:59.880 } 00:15:59.880 ] 00:15:59.880 } 00:15:59.880 } 00:15:59.880 }' 00:15:59.880 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:00.138 10:23:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:00.138 BaseBdev2 00:16:00.138 BaseBdev3' 00:16:00.138 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:00.138 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:00.138 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:00.138 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:00.138 "name": "BaseBdev1", 00:16:00.138 "aliases": [ 00:16:00.138 "0a41c11b-9a90-4646-9f84-a716999b059e" 00:16:00.138 ], 00:16:00.138 "product_name": "Malloc disk", 00:16:00.138 "block_size": 512, 00:16:00.138 "num_blocks": 65536, 00:16:00.138 "uuid": "0a41c11b-9a90-4646-9f84-a716999b059e", 00:16:00.138 "assigned_rate_limits": { 00:16:00.138 "rw_ios_per_sec": 0, 00:16:00.138 "rw_mbytes_per_sec": 0, 00:16:00.138 "r_mbytes_per_sec": 0, 00:16:00.138 "w_mbytes_per_sec": 0 00:16:00.138 }, 00:16:00.138 "claimed": true, 00:16:00.138 "claim_type": "exclusive_write", 00:16:00.138 "zoned": false, 00:16:00.138 "supported_io_types": { 00:16:00.138 "read": true, 00:16:00.138 "write": true, 00:16:00.138 "unmap": true, 00:16:00.138 "flush": true, 00:16:00.138 "reset": true, 00:16:00.138 "nvme_admin": false, 00:16:00.138 "nvme_io": false, 00:16:00.139 "nvme_io_md": false, 00:16:00.139 "write_zeroes": true, 00:16:00.139 "zcopy": true, 00:16:00.139 "get_zone_info": false, 00:16:00.139 "zone_management": false, 00:16:00.139 "zone_append": false, 00:16:00.139 "compare": false, 00:16:00.139 "compare_and_write": false, 00:16:00.139 "abort": true, 00:16:00.139 "seek_hole": false, 00:16:00.139 "seek_data": false, 00:16:00.139 "copy": true, 00:16:00.139 "nvme_iov_md": false 00:16:00.139 }, 00:16:00.139 "memory_domains": 
[ 00:16:00.139 { 00:16:00.139 "dma_device_id": "system", 00:16:00.139 "dma_device_type": 1 00:16:00.139 }, 00:16:00.139 { 00:16:00.139 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.139 "dma_device_type": 2 00:16:00.139 } 00:16:00.139 ], 00:16:00.139 "driver_specific": {} 00:16:00.139 }' 00:16:00.139 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.139 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.434 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:00.692 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:00.692 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:00.692 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:00.692 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
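The `bdev_raid.sh@205`-`@208` steps above pipe each base bdev's `bdev_get_bdevs` record through `jq` and compare four metadata fields against the expected values. As a minimal sketch (in Python rather than the test's bash/jq, and with the record trimmed to the fields those steps actually read), the checks amount to:

```python
import json

# Sample record shaped like the `bdev_get_bdevs -b BaseBdev1` output in the
# log above, trimmed to the fields the @205-@208 checks read.
base_bdev_info = json.loads("""
{
  "name": "BaseBdev1",
  "block_size": 512,
  "num_blocks": 65536
}
""")

# bdev_raid.sh@205-@208 run `jq .block_size`, `.md_size`, `.md_interleave`
# and `.dif_type` and compare with `[[ ... == ... ]]`. A key absent from the
# record prints as `null` in jq; dict.get() returning None mirrors that.
assert base_bdev_info.get("block_size") == 512   # [[ 512 == 512 ]]
assert base_bdev_info.get("md_size") is None      # [[ null == null ]]
assert base_bdev_info.get("md_interleave") is None
assert base_bdev_info.get("dif_type") is None
print("base bdev metadata checks passed")
```

The same four comparisons repeat verbatim in the log for BaseBdev2 and BaseBdev3: a plain malloc base bdev carries no metadata or DIF configuration, so everything but `block_size` is expected to be `null`.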
00:16:00.692 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:00.692 "name": "BaseBdev2", 00:16:00.692 "aliases": [ 00:16:00.692 "8678d897-bec2-4520-b541-e5cbb3af7513" 00:16:00.692 ], 00:16:00.692 "product_name": "Malloc disk", 00:16:00.692 "block_size": 512, 00:16:00.692 "num_blocks": 65536, 00:16:00.692 "uuid": "8678d897-bec2-4520-b541-e5cbb3af7513", 00:16:00.692 "assigned_rate_limits": { 00:16:00.692 "rw_ios_per_sec": 0, 00:16:00.692 "rw_mbytes_per_sec": 0, 00:16:00.692 "r_mbytes_per_sec": 0, 00:16:00.692 "w_mbytes_per_sec": 0 00:16:00.692 }, 00:16:00.692 "claimed": true, 00:16:00.692 "claim_type": "exclusive_write", 00:16:00.692 "zoned": false, 00:16:00.692 "supported_io_types": { 00:16:00.692 "read": true, 00:16:00.692 "write": true, 00:16:00.692 "unmap": true, 00:16:00.692 "flush": true, 00:16:00.692 "reset": true, 00:16:00.692 "nvme_admin": false, 00:16:00.692 "nvme_io": false, 00:16:00.692 "nvme_io_md": false, 00:16:00.692 "write_zeroes": true, 00:16:00.692 "zcopy": true, 00:16:00.692 "get_zone_info": false, 00:16:00.692 "zone_management": false, 00:16:00.692 "zone_append": false, 00:16:00.692 "compare": false, 00:16:00.692 "compare_and_write": false, 00:16:00.692 "abort": true, 00:16:00.692 "seek_hole": false, 00:16:00.692 "seek_data": false, 00:16:00.692 "copy": true, 00:16:00.692 "nvme_iov_md": false 00:16:00.692 }, 00:16:00.692 "memory_domains": [ 00:16:00.692 { 00:16:00.692 "dma_device_id": "system", 00:16:00.692 "dma_device_type": 1 00:16:00.692 }, 00:16:00.692 { 00:16:00.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.692 "dma_device_type": 2 00:16:00.692 } 00:16:00.692 ], 00:16:00.692 "driver_specific": {} 00:16:00.692 }' 00:16:00.692 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.951 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:00.951 10:23:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:00.951 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.951 10:23:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:00.951 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:00.951 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.951 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:00.951 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:00.951 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.210 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.210 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:01.210 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:01.210 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:01.210 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:01.468 "name": "BaseBdev3", 00:16:01.468 "aliases": [ 00:16:01.468 "17219f4c-6d43-4165-b361-13ac6d2608cd" 00:16:01.468 ], 00:16:01.468 "product_name": "Malloc disk", 00:16:01.468 "block_size": 512, 00:16:01.468 "num_blocks": 65536, 00:16:01.468 "uuid": "17219f4c-6d43-4165-b361-13ac6d2608cd", 00:16:01.468 "assigned_rate_limits": { 00:16:01.468 "rw_ios_per_sec": 0, 00:16:01.468 "rw_mbytes_per_sec": 0, 00:16:01.468 "r_mbytes_per_sec": 0, 00:16:01.468 
"w_mbytes_per_sec": 0 00:16:01.468 }, 00:16:01.468 "claimed": true, 00:16:01.468 "claim_type": "exclusive_write", 00:16:01.468 "zoned": false, 00:16:01.468 "supported_io_types": { 00:16:01.468 "read": true, 00:16:01.468 "write": true, 00:16:01.468 "unmap": true, 00:16:01.468 "flush": true, 00:16:01.468 "reset": true, 00:16:01.468 "nvme_admin": false, 00:16:01.468 "nvme_io": false, 00:16:01.468 "nvme_io_md": false, 00:16:01.468 "write_zeroes": true, 00:16:01.468 "zcopy": true, 00:16:01.468 "get_zone_info": false, 00:16:01.468 "zone_management": false, 00:16:01.468 "zone_append": false, 00:16:01.468 "compare": false, 00:16:01.468 "compare_and_write": false, 00:16:01.468 "abort": true, 00:16:01.468 "seek_hole": false, 00:16:01.468 "seek_data": false, 00:16:01.468 "copy": true, 00:16:01.468 "nvme_iov_md": false 00:16:01.468 }, 00:16:01.468 "memory_domains": [ 00:16:01.468 { 00:16:01.468 "dma_device_id": "system", 00:16:01.468 "dma_device_type": 1 00:16:01.468 }, 00:16:01.468 { 00:16:01.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.468 "dma_device_type": 2 00:16:01.468 } 00:16:01.468 ], 00:16:01.468 "driver_specific": {} 00:16:01.468 }' 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:01.468 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:01.727 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:16:01.727 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:01.727 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.727 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:01.727 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:01.727 10:23:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:01.986 [2024-07-15 10:23:39.039152] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:01.986 [2024-07-15 10:23:39.039179] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:01.986 [2024-07-15 10:23:39.039220] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.986 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.244 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.244 "name": "Existed_Raid", 00:16:02.244 "uuid": "158c0b3d-b06a-4d81-b45d-f6bae473d483", 00:16:02.244 "strip_size_kb": 64, 00:16:02.244 "state": "offline", 00:16:02.244 "raid_level": "concat", 00:16:02.244 "superblock": true, 00:16:02.244 "num_base_bdevs": 3, 00:16:02.244 "num_base_bdevs_discovered": 2, 00:16:02.244 "num_base_bdevs_operational": 2, 00:16:02.244 "base_bdevs_list": [ 00:16:02.244 { 00:16:02.244 "name": null, 00:16:02.244 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.244 "is_configured": false, 00:16:02.244 "data_offset": 2048, 00:16:02.244 "data_size": 63488 00:16:02.244 }, 00:16:02.244 { 00:16:02.244 "name": "BaseBdev2", 00:16:02.244 "uuid": "8678d897-bec2-4520-b541-e5cbb3af7513", 00:16:02.244 "is_configured": true, 00:16:02.244 "data_offset": 2048, 00:16:02.244 "data_size": 
63488 00:16:02.244 }, 00:16:02.244 { 00:16:02.244 "name": "BaseBdev3", 00:16:02.244 "uuid": "17219f4c-6d43-4165-b361-13ac6d2608cd", 00:16:02.244 "is_configured": true, 00:16:02.244 "data_offset": 2048, 00:16:02.244 "data_size": 63488 00:16:02.244 } 00:16:02.244 ] 00:16:02.245 }' 00:16:02.245 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.245 10:23:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.811 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:02.811 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:02.811 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.811 10:23:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:03.069 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:03.069 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:03.069 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:03.327 [2024-07-15 10:23:40.376626] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:03.327 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:03.327 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:03.327 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
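The sequence above deletes `BaseBdev1`, and because `has_redundancy concat` returns 1 (concat tolerates no base bdev loss), `verify_raid_bdev_state` expects the array to go from `online` to `offline`. A minimal Python sketch of that verification, with the values copied from the `bdev_raid_get_bdevs all` output captured in the log:

```python
import json

# JSON shaped like the Existed_Raid entry the log captures right after
# `bdev_malloc_delete BaseBdev1` (values copied from the log above).
raid_bdev_info = json.loads("""
{
  "name": "Existed_Raid",
  "state": "offline",
  "raid_level": "concat",
  "strip_size_kb": 64,
  "num_base_bdevs": 3,
  "num_base_bdevs_discovered": 2,
  "num_base_bdevs_operational": 2,
  "base_bdevs_list": [
    {"name": null, "is_configured": false},
    {"name": "BaseBdev2", "is_configured": true},
    {"name": "BaseBdev3", "is_configured": true}
  ]
}
""")

# verify_raid_bdev_state Existed_Raid offline concat 64 2 boils down to
# these field comparisons on the jq-selected record.
assert raid_bdev_info["state"] == "offline"
assert raid_bdev_info["raid_level"] == "concat"
assert raid_bdev_info["strip_size_kb"] == 64
assert raid_bdev_info["num_base_bdevs_discovered"] == 2
assert raid_bdev_info["num_base_bdevs_operational"] == 2
print("offline state verified")
```

Note the deleted slot keeps its position in `base_bdevs_list` with `"name": null` and the zero UUID, so `num_base_bdevs` stays 3 while discovered/operational drop to 2.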
00:16:03.327 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:03.585 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:03.585 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:03.585 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:03.843 [2024-07-15 10:23:40.898434] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:03.843 [2024-07-15 10:23:40.898477] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x811400 name Existed_Raid, state offline 00:16:03.843 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:03.843 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:03.843 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.843 10:23:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:04.101 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:04.101 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:04.101 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:04.101 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:04.101 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:04.101 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:04.358 BaseBdev2 00:16:04.358 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:04.358 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:04.358 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:04.358 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:04.358 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:04.358 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:04.358 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:04.616 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:04.873 [ 00:16:04.873 { 00:16:04.873 "name": "BaseBdev2", 00:16:04.873 "aliases": [ 00:16:04.873 "f72baa69-937e-4eda-b642-368735835c77" 00:16:04.873 ], 00:16:04.873 "product_name": "Malloc disk", 00:16:04.873 "block_size": 512, 00:16:04.873 "num_blocks": 65536, 00:16:04.873 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:04.873 "assigned_rate_limits": { 00:16:04.873 "rw_ios_per_sec": 0, 00:16:04.873 "rw_mbytes_per_sec": 0, 00:16:04.873 "r_mbytes_per_sec": 0, 00:16:04.873 "w_mbytes_per_sec": 0 00:16:04.873 }, 00:16:04.873 "claimed": false, 00:16:04.873 "zoned": false, 00:16:04.873 "supported_io_types": { 00:16:04.873 "read": true, 00:16:04.873 "write": true, 00:16:04.873 "unmap": true, 00:16:04.873 "flush": 
true, 00:16:04.873 "reset": true, 00:16:04.873 "nvme_admin": false, 00:16:04.873 "nvme_io": false, 00:16:04.873 "nvme_io_md": false, 00:16:04.873 "write_zeroes": true, 00:16:04.873 "zcopy": true, 00:16:04.873 "get_zone_info": false, 00:16:04.873 "zone_management": false, 00:16:04.873 "zone_append": false, 00:16:04.873 "compare": false, 00:16:04.873 "compare_and_write": false, 00:16:04.873 "abort": true, 00:16:04.873 "seek_hole": false, 00:16:04.873 "seek_data": false, 00:16:04.873 "copy": true, 00:16:04.873 "nvme_iov_md": false 00:16:04.873 }, 00:16:04.873 "memory_domains": [ 00:16:04.873 { 00:16:04.873 "dma_device_id": "system", 00:16:04.873 "dma_device_type": 1 00:16:04.873 }, 00:16:04.873 { 00:16:04.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.873 "dma_device_type": 2 00:16:04.873 } 00:16:04.873 ], 00:16:04.873 "driver_specific": {} 00:16:04.873 } 00:16:04.873 ] 00:16:04.873 10:23:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:04.873 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:04.873 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:04.873 10:23:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:05.130 BaseBdev3 00:16:05.130 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:05.130 10:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:05.130 10:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:05.130 10:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:05.130 10:23:42 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:05.130 10:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:05.130 10:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:05.387 10:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:05.663 [ 00:16:05.663 { 00:16:05.663 "name": "BaseBdev3", 00:16:05.663 "aliases": [ 00:16:05.664 "17b13b2c-8db1-4198-9fb9-5f05da0517a5" 00:16:05.664 ], 00:16:05.664 "product_name": "Malloc disk", 00:16:05.664 "block_size": 512, 00:16:05.664 "num_blocks": 65536, 00:16:05.664 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:05.664 "assigned_rate_limits": { 00:16:05.664 "rw_ios_per_sec": 0, 00:16:05.664 "rw_mbytes_per_sec": 0, 00:16:05.664 "r_mbytes_per_sec": 0, 00:16:05.664 "w_mbytes_per_sec": 0 00:16:05.664 }, 00:16:05.664 "claimed": false, 00:16:05.664 "zoned": false, 00:16:05.664 "supported_io_types": { 00:16:05.664 "read": true, 00:16:05.664 "write": true, 00:16:05.664 "unmap": true, 00:16:05.664 "flush": true, 00:16:05.664 "reset": true, 00:16:05.664 "nvme_admin": false, 00:16:05.664 "nvme_io": false, 00:16:05.664 "nvme_io_md": false, 00:16:05.664 "write_zeroes": true, 00:16:05.664 "zcopy": true, 00:16:05.664 "get_zone_info": false, 00:16:05.664 "zone_management": false, 00:16:05.664 "zone_append": false, 00:16:05.664 "compare": false, 00:16:05.664 "compare_and_write": false, 00:16:05.664 "abort": true, 00:16:05.664 "seek_hole": false, 00:16:05.664 "seek_data": false, 00:16:05.664 "copy": true, 00:16:05.664 "nvme_iov_md": false 00:16:05.664 }, 00:16:05.664 "memory_domains": [ 00:16:05.664 { 00:16:05.664 "dma_device_id": "system", 00:16:05.664 "dma_device_type": 1 
00:16:05.664 }, 00:16:05.664 { 00:16:05.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.664 "dma_device_type": 2 00:16:05.664 } 00:16:05.664 ], 00:16:05.664 "driver_specific": {} 00:16:05.664 } 00:16:05.664 ] 00:16:05.664 10:23:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:05.664 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:05.664 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:05.664 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:05.921 [2024-07-15 10:23:42.872862] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:05.921 [2024-07-15 10:23:42.872903] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:05.921 [2024-07-15 10:23:42.872924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:05.921 [2024-07-15 10:23:42.874343] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.921 10:23:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.186 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.186 "name": "Existed_Raid", 00:16:06.186 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:06.186 "strip_size_kb": 64, 00:16:06.186 "state": "configuring", 00:16:06.186 "raid_level": "concat", 00:16:06.186 "superblock": true, 00:16:06.186 "num_base_bdevs": 3, 00:16:06.186 "num_base_bdevs_discovered": 2, 00:16:06.186 "num_base_bdevs_operational": 3, 00:16:06.186 "base_bdevs_list": [ 00:16:06.186 { 00:16:06.186 "name": "BaseBdev1", 00:16:06.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.186 "is_configured": false, 00:16:06.186 "data_offset": 0, 00:16:06.186 "data_size": 0 00:16:06.186 }, 00:16:06.186 { 00:16:06.186 "name": "BaseBdev2", 00:16:06.186 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:06.186 "is_configured": true, 00:16:06.186 "data_offset": 2048, 00:16:06.186 "data_size": 63488 00:16:06.186 }, 00:16:06.186 { 00:16:06.186 "name": "BaseBdev3", 00:16:06.186 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:06.186 "is_configured": true, 00:16:06.186 "data_offset": 2048, 00:16:06.186 
"data_size": 63488 00:16:06.186 } 00:16:06.186 ] 00:16:06.186 }' 00:16:06.186 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.186 10:23:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:06.757 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:07.013 [2024-07-15 10:23:43.971751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
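The `bdev_raid_remove_base_bdev BaseBdev2` call above deconfigures a slot without deleting the backing bdev's reservation in the array: the slot keeps its UUID but loses its name and `is_configured` flag, and the raid stays in `configuring` with `num_base_bdevs_operational` still 3. Sketched in Python (the test itself checks this in `bdev_raid.sh@310` with `jq '.[0].base_bdevs_list[1].is_configured'`), using the list values from the log:

```python
import json

# base_bdevs_list as the log shows it after removing BaseBdev2: slot 1
# keeps its uuid but has name null and is_configured false.
base_bdevs_list = json.loads("""
[
  {"name": "BaseBdev1", "uuid": "00000000-0000-0000-0000-000000000000",
   "is_configured": false},
  {"name": null, "uuid": "f72baa69-937e-4eda-b642-368735835c77",
   "is_configured": false},
  {"name": "BaseBdev3", "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5",
   "is_configured": true}
]
""")

# bdev_raid.sh@310 expects jq to print the literal "false" for slot 1.
assert base_bdevs_list[1]["is_configured"] is False
# Only BaseBdev3 is configured at this point, matching the log's
# num_base_bdevs_discovered == 1 while operational stays 3.
configured = sum(1 for b in base_bdevs_list if b["is_configured"])
assert configured == 1
print("removed slot deconfigured; 1 of 3 slots configured")
```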
00:16:07.013 10:23:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.270 10:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.270 "name": "Existed_Raid", 00:16:07.270 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:07.270 "strip_size_kb": 64, 00:16:07.270 "state": "configuring", 00:16:07.270 "raid_level": "concat", 00:16:07.270 "superblock": true, 00:16:07.270 "num_base_bdevs": 3, 00:16:07.270 "num_base_bdevs_discovered": 1, 00:16:07.270 "num_base_bdevs_operational": 3, 00:16:07.270 "base_bdevs_list": [ 00:16:07.270 { 00:16:07.270 "name": "BaseBdev1", 00:16:07.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:07.270 "is_configured": false, 00:16:07.270 "data_offset": 0, 00:16:07.270 "data_size": 0 00:16:07.270 }, 00:16:07.270 { 00:16:07.270 "name": null, 00:16:07.270 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:07.270 "is_configured": false, 00:16:07.270 "data_offset": 2048, 00:16:07.270 "data_size": 63488 00:16:07.270 }, 00:16:07.270 { 00:16:07.270 "name": "BaseBdev3", 00:16:07.270 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:07.270 "is_configured": true, 00:16:07.270 "data_offset": 2048, 00:16:07.270 "data_size": 63488 00:16:07.270 } 00:16:07.270 ] 00:16:07.270 }' 00:16:07.270 10:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.270 10:23:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.834 10:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:07.834 10:23:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.090 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:16:08.090 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:08.348 [2024-07-15 10:23:45.311754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:08.348 BaseBdev1 00:16:08.348 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:08.348 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:08.348 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:08.348 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:08.348 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:08.348 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:08.348 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:08.604 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:08.604 [ 00:16:08.604 { 00:16:08.604 "name": "BaseBdev1", 00:16:08.604 "aliases": [ 00:16:08.604 "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f" 00:16:08.604 ], 00:16:08.604 "product_name": "Malloc disk", 00:16:08.604 "block_size": 512, 00:16:08.604 "num_blocks": 65536, 00:16:08.604 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:08.604 "assigned_rate_limits": { 00:16:08.604 "rw_ios_per_sec": 0, 00:16:08.604 "rw_mbytes_per_sec": 0, 00:16:08.604 "r_mbytes_per_sec": 0, 00:16:08.604 
"w_mbytes_per_sec": 0 00:16:08.604 }, 00:16:08.604 "claimed": true, 00:16:08.604 "claim_type": "exclusive_write", 00:16:08.604 "zoned": false, 00:16:08.604 "supported_io_types": { 00:16:08.604 "read": true, 00:16:08.604 "write": true, 00:16:08.604 "unmap": true, 00:16:08.604 "flush": true, 00:16:08.604 "reset": true, 00:16:08.604 "nvme_admin": false, 00:16:08.604 "nvme_io": false, 00:16:08.604 "nvme_io_md": false, 00:16:08.604 "write_zeroes": true, 00:16:08.604 "zcopy": true, 00:16:08.604 "get_zone_info": false, 00:16:08.604 "zone_management": false, 00:16:08.604 "zone_append": false, 00:16:08.604 "compare": false, 00:16:08.604 "compare_and_write": false, 00:16:08.604 "abort": true, 00:16:08.604 "seek_hole": false, 00:16:08.604 "seek_data": false, 00:16:08.604 "copy": true, 00:16:08.604 "nvme_iov_md": false 00:16:08.604 }, 00:16:08.604 "memory_domains": [ 00:16:08.604 { 00:16:08.604 "dma_device_id": "system", 00:16:08.604 "dma_device_type": 1 00:16:08.604 }, 00:16:08.604 { 00:16:08.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.604 "dma_device_type": 2 00:16:08.604 } 00:16:08.604 ], 00:16:08.604 "driver_specific": {} 00:16:08.604 } 00:16:08.604 ] 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.861 10:23:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.861 10:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.861 "name": "Existed_Raid", 00:16:08.861 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:08.861 "strip_size_kb": 64, 00:16:08.861 "state": "configuring", 00:16:08.861 "raid_level": "concat", 00:16:08.861 "superblock": true, 00:16:08.861 "num_base_bdevs": 3, 00:16:08.861 "num_base_bdevs_discovered": 2, 00:16:08.861 "num_base_bdevs_operational": 3, 00:16:08.861 "base_bdevs_list": [ 00:16:08.861 { 00:16:08.861 "name": "BaseBdev1", 00:16:08.861 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:08.861 "is_configured": true, 00:16:08.861 "data_offset": 2048, 00:16:08.861 "data_size": 63488 00:16:08.861 }, 00:16:08.861 { 00:16:08.861 "name": null, 00:16:08.861 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:08.861 "is_configured": false, 00:16:08.861 "data_offset": 2048, 00:16:08.861 "data_size": 63488 00:16:08.861 }, 00:16:08.861 { 00:16:08.861 "name": "BaseBdev3", 00:16:08.861 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:08.861 "is_configured": true, 00:16:08.861 "data_offset": 2048, 00:16:08.861 "data_size": 63488 00:16:08.861 } 
00:16:08.861 ] 00:16:08.861 }' 00:16:08.861 10:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.861 10:23:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:09.791 10:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.791 10:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:09.791 10:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:09.791 10:23:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:10.048 [2024-07-15 10:23:47.136633] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.048 
10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.048 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.305 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.305 "name": "Existed_Raid", 00:16:10.305 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:10.305 "strip_size_kb": 64, 00:16:10.305 "state": "configuring", 00:16:10.305 "raid_level": "concat", 00:16:10.305 "superblock": true, 00:16:10.305 "num_base_bdevs": 3, 00:16:10.305 "num_base_bdevs_discovered": 1, 00:16:10.305 "num_base_bdevs_operational": 3, 00:16:10.305 "base_bdevs_list": [ 00:16:10.305 { 00:16:10.305 "name": "BaseBdev1", 00:16:10.305 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:10.305 "is_configured": true, 00:16:10.305 "data_offset": 2048, 00:16:10.305 "data_size": 63488 00:16:10.305 }, 00:16:10.305 { 00:16:10.306 "name": null, 00:16:10.306 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:10.306 "is_configured": false, 00:16:10.306 "data_offset": 2048, 00:16:10.306 "data_size": 63488 00:16:10.306 }, 00:16:10.306 { 00:16:10.306 "name": null, 00:16:10.306 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:10.306 "is_configured": false, 00:16:10.306 "data_offset": 2048, 00:16:10.306 "data_size": 63488 00:16:10.306 } 00:16:10.306 ] 00:16:10.306 }' 00:16:10.306 10:23:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.306 10:23:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:10.868 10:23:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.868 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:11.123 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:11.123 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:11.379 [2024-07-15 10:23:48.424064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.379 10:23:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:11.379 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.635 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.635 "name": "Existed_Raid", 00:16:11.635 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:11.635 "strip_size_kb": 64, 00:16:11.635 "state": "configuring", 00:16:11.635 "raid_level": "concat", 00:16:11.635 "superblock": true, 00:16:11.635 "num_base_bdevs": 3, 00:16:11.635 "num_base_bdevs_discovered": 2, 00:16:11.635 "num_base_bdevs_operational": 3, 00:16:11.635 "base_bdevs_list": [ 00:16:11.635 { 00:16:11.635 "name": "BaseBdev1", 00:16:11.635 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:11.635 "is_configured": true, 00:16:11.635 "data_offset": 2048, 00:16:11.635 "data_size": 63488 00:16:11.635 }, 00:16:11.635 { 00:16:11.635 "name": null, 00:16:11.635 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:11.635 "is_configured": false, 00:16:11.635 "data_offset": 2048, 00:16:11.635 "data_size": 63488 00:16:11.635 }, 00:16:11.635 { 00:16:11.635 "name": "BaseBdev3", 00:16:11.635 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:11.635 "is_configured": true, 00:16:11.635 "data_offset": 2048, 00:16:11.635 "data_size": 63488 00:16:11.635 } 00:16:11.635 ] 00:16:11.635 }' 00:16:11.636 10:23:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.636 10:23:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:12.198 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.198 10:23:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:12.455 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:12.455 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:12.713 [2024-07-15 10:23:49.771663] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.713 10:23:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.970 10:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.970 "name": "Existed_Raid", 00:16:12.970 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:12.970 "strip_size_kb": 64, 00:16:12.970 "state": "configuring", 00:16:12.970 "raid_level": "concat", 00:16:12.970 "superblock": true, 00:16:12.970 "num_base_bdevs": 3, 00:16:12.970 "num_base_bdevs_discovered": 1, 00:16:12.970 "num_base_bdevs_operational": 3, 00:16:12.970 "base_bdevs_list": [ 00:16:12.970 { 00:16:12.970 "name": null, 00:16:12.970 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:12.970 "is_configured": false, 00:16:12.970 "data_offset": 2048, 00:16:12.970 "data_size": 63488 00:16:12.970 }, 00:16:12.970 { 00:16:12.970 "name": null, 00:16:12.970 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:12.970 "is_configured": false, 00:16:12.970 "data_offset": 2048, 00:16:12.970 "data_size": 63488 00:16:12.970 }, 00:16:12.970 { 00:16:12.970 "name": "BaseBdev3", 00:16:12.970 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:12.970 "is_configured": true, 00:16:12.970 "data_offset": 2048, 00:16:12.970 "data_size": 63488 00:16:12.970 } 00:16:12.970 ] 00:16:12.970 }' 00:16:12.970 10:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.971 10:23:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:13.535 10:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.535 10:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:13.793 10:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:13.793 10:23:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:14.050 [2024-07-15 10:23:51.057613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.050 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.307 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.307 "name": 
"Existed_Raid", 00:16:14.307 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:14.307 "strip_size_kb": 64, 00:16:14.307 "state": "configuring", 00:16:14.307 "raid_level": "concat", 00:16:14.307 "superblock": true, 00:16:14.307 "num_base_bdevs": 3, 00:16:14.307 "num_base_bdevs_discovered": 2, 00:16:14.308 "num_base_bdevs_operational": 3, 00:16:14.308 "base_bdevs_list": [ 00:16:14.308 { 00:16:14.308 "name": null, 00:16:14.308 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:14.308 "is_configured": false, 00:16:14.308 "data_offset": 2048, 00:16:14.308 "data_size": 63488 00:16:14.308 }, 00:16:14.308 { 00:16:14.308 "name": "BaseBdev2", 00:16:14.308 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:14.308 "is_configured": true, 00:16:14.308 "data_offset": 2048, 00:16:14.308 "data_size": 63488 00:16:14.308 }, 00:16:14.308 { 00:16:14.308 "name": "BaseBdev3", 00:16:14.308 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:14.308 "is_configured": true, 00:16:14.308 "data_offset": 2048, 00:16:14.308 "data_size": 63488 00:16:14.308 } 00:16:14.308 ] 00:16:14.308 }' 00:16:14.308 10:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.308 10:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:15.303 10:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.303 10:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:15.303 10:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:15.303 10:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.303 10:23:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:15.562 10:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f 00:16:15.821 [2024-07-15 10:23:52.822384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:15.821 [2024-07-15 10:23:52.822540] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x80ff50 00:16:15.821 [2024-07-15 10:23:52.822553] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:15.821 [2024-07-15 10:23:52.822727] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x516940 00:16:15.821 [2024-07-15 10:23:52.822842] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x80ff50 00:16:15.821 [2024-07-15 10:23:52.822852] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x80ff50 00:16:15.821 [2024-07-15 10:23:52.822952] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:15.821 NewBaseBdev 00:16:15.821 10:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:15.821 10:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:15.821 10:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.821 10:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:15.821 10:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.821 10:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.821 10:23:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:16.081 [ 00:16:16.081 { 00:16:16.081 "name": "NewBaseBdev", 00:16:16.081 "aliases": [ 00:16:16.081 "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f" 00:16:16.081 ], 00:16:16.081 "product_name": "Malloc disk", 00:16:16.081 "block_size": 512, 00:16:16.081 "num_blocks": 65536, 00:16:16.081 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:16.081 "assigned_rate_limits": { 00:16:16.081 "rw_ios_per_sec": 0, 00:16:16.081 "rw_mbytes_per_sec": 0, 00:16:16.081 "r_mbytes_per_sec": 0, 00:16:16.081 "w_mbytes_per_sec": 0 00:16:16.081 }, 00:16:16.081 "claimed": true, 00:16:16.081 "claim_type": "exclusive_write", 00:16:16.081 "zoned": false, 00:16:16.081 "supported_io_types": { 00:16:16.081 "read": true, 00:16:16.081 "write": true, 00:16:16.081 "unmap": true, 00:16:16.081 "flush": true, 00:16:16.081 "reset": true, 00:16:16.081 "nvme_admin": false, 00:16:16.081 "nvme_io": false, 00:16:16.081 "nvme_io_md": false, 00:16:16.081 "write_zeroes": true, 00:16:16.081 "zcopy": true, 00:16:16.081 "get_zone_info": false, 00:16:16.081 "zone_management": false, 00:16:16.081 "zone_append": false, 00:16:16.081 "compare": false, 00:16:16.081 "compare_and_write": false, 00:16:16.081 "abort": true, 00:16:16.081 "seek_hole": false, 00:16:16.081 "seek_data": false, 00:16:16.081 "copy": true, 00:16:16.081 "nvme_iov_md": false 00:16:16.081 }, 00:16:16.081 "memory_domains": [ 00:16:16.081 { 00:16:16.081 "dma_device_id": "system", 00:16:16.081 "dma_device_type": 1 00:16:16.081 }, 00:16:16.081 { 00:16:16.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.081 "dma_device_type": 2 00:16:16.081 } 
00:16:16.081 ], 00:16:16.081 "driver_specific": {} 00:16:16.081 } 00:16:16.081 ] 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.081 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.340 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.340 "name": "Existed_Raid", 00:16:16.340 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:16.340 "strip_size_kb": 64, 00:16:16.340 "state": "online", 00:16:16.340 
"raid_level": "concat", 00:16:16.340 "superblock": true, 00:16:16.340 "num_base_bdevs": 3, 00:16:16.340 "num_base_bdevs_discovered": 3, 00:16:16.340 "num_base_bdevs_operational": 3, 00:16:16.340 "base_bdevs_list": [ 00:16:16.340 { 00:16:16.340 "name": "NewBaseBdev", 00:16:16.340 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:16.340 "is_configured": true, 00:16:16.340 "data_offset": 2048, 00:16:16.340 "data_size": 63488 00:16:16.340 }, 00:16:16.340 { 00:16:16.340 "name": "BaseBdev2", 00:16:16.340 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:16.340 "is_configured": true, 00:16:16.340 "data_offset": 2048, 00:16:16.340 "data_size": 63488 00:16:16.340 }, 00:16:16.340 { 00:16:16.340 "name": "BaseBdev3", 00:16:16.340 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:16.340 "is_configured": true, 00:16:16.340 "data_offset": 2048, 00:16:16.340 "data_size": 63488 00:16:16.340 } 00:16:16.340 ] 00:16:16.340 }' 00:16:16.340 10:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.340 10:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:16.908 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:17.167 [2024-07-15 10:23:54.258485] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:17.167 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:17.167 "name": "Existed_Raid", 00:16:17.167 "aliases": [ 00:16:17.167 "8a9bcece-f767-4591-b2f3-deb8ea101743" 00:16:17.167 ], 00:16:17.167 "product_name": "Raid Volume", 00:16:17.167 "block_size": 512, 00:16:17.167 "num_blocks": 190464, 00:16:17.167 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:17.167 "assigned_rate_limits": { 00:16:17.167 "rw_ios_per_sec": 0, 00:16:17.167 "rw_mbytes_per_sec": 0, 00:16:17.167 "r_mbytes_per_sec": 0, 00:16:17.167 "w_mbytes_per_sec": 0 00:16:17.167 }, 00:16:17.167 "claimed": false, 00:16:17.167 "zoned": false, 00:16:17.167 "supported_io_types": { 00:16:17.167 "read": true, 00:16:17.167 "write": true, 00:16:17.167 "unmap": true, 00:16:17.167 "flush": true, 00:16:17.167 "reset": true, 00:16:17.167 "nvme_admin": false, 00:16:17.167 "nvme_io": false, 00:16:17.167 "nvme_io_md": false, 00:16:17.167 "write_zeroes": true, 00:16:17.167 "zcopy": false, 00:16:17.167 "get_zone_info": false, 00:16:17.167 "zone_management": false, 00:16:17.167 "zone_append": false, 00:16:17.167 "compare": false, 00:16:17.167 "compare_and_write": false, 00:16:17.167 "abort": false, 00:16:17.167 "seek_hole": false, 00:16:17.167 "seek_data": false, 00:16:17.167 "copy": false, 00:16:17.167 "nvme_iov_md": false 00:16:17.167 }, 00:16:17.167 "memory_domains": [ 00:16:17.167 { 00:16:17.167 "dma_device_id": "system", 00:16:17.167 "dma_device_type": 1 00:16:17.167 }, 00:16:17.167 { 00:16:17.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.167 "dma_device_type": 2 00:16:17.167 }, 00:16:17.167 { 00:16:17.167 "dma_device_id": "system", 00:16:17.167 "dma_device_type": 1 00:16:17.167 
}, 00:16:17.167 { 00:16:17.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.167 "dma_device_type": 2 00:16:17.167 }, 00:16:17.167 { 00:16:17.167 "dma_device_id": "system", 00:16:17.167 "dma_device_type": 1 00:16:17.167 }, 00:16:17.167 { 00:16:17.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.167 "dma_device_type": 2 00:16:17.167 } 00:16:17.167 ], 00:16:17.167 "driver_specific": { 00:16:17.167 "raid": { 00:16:17.167 "uuid": "8a9bcece-f767-4591-b2f3-deb8ea101743", 00:16:17.167 "strip_size_kb": 64, 00:16:17.167 "state": "online", 00:16:17.167 "raid_level": "concat", 00:16:17.167 "superblock": true, 00:16:17.167 "num_base_bdevs": 3, 00:16:17.167 "num_base_bdevs_discovered": 3, 00:16:17.167 "num_base_bdevs_operational": 3, 00:16:17.167 "base_bdevs_list": [ 00:16:17.167 { 00:16:17.167 "name": "NewBaseBdev", 00:16:17.167 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:17.167 "is_configured": true, 00:16:17.167 "data_offset": 2048, 00:16:17.167 "data_size": 63488 00:16:17.167 }, 00:16:17.167 { 00:16:17.167 "name": "BaseBdev2", 00:16:17.167 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:17.167 "is_configured": true, 00:16:17.167 "data_offset": 2048, 00:16:17.167 "data_size": 63488 00:16:17.167 }, 00:16:17.167 { 00:16:17.167 "name": "BaseBdev3", 00:16:17.167 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:17.167 "is_configured": true, 00:16:17.167 "data_offset": 2048, 00:16:17.167 "data_size": 63488 00:16:17.167 } 00:16:17.167 ] 00:16:17.167 } 00:16:17.167 } 00:16:17.167 }' 00:16:17.167 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:17.167 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:17.167 BaseBdev2 00:16:17.167 BaseBdev3' 00:16:17.167 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.167 
10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:17.167 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.735 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.735 "name": "NewBaseBdev", 00:16:17.735 "aliases": [ 00:16:17.735 "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f" 00:16:17.735 ], 00:16:17.735 "product_name": "Malloc disk", 00:16:17.735 "block_size": 512, 00:16:17.735 "num_blocks": 65536, 00:16:17.735 "uuid": "7b162452-e9c5-4a1a-94bf-ed5fc98b8f9f", 00:16:17.735 "assigned_rate_limits": { 00:16:17.735 "rw_ios_per_sec": 0, 00:16:17.735 "rw_mbytes_per_sec": 0, 00:16:17.735 "r_mbytes_per_sec": 0, 00:16:17.735 "w_mbytes_per_sec": 0 00:16:17.735 }, 00:16:17.735 "claimed": true, 00:16:17.735 "claim_type": "exclusive_write", 00:16:17.735 "zoned": false, 00:16:17.735 "supported_io_types": { 00:16:17.735 "read": true, 00:16:17.735 "write": true, 00:16:17.735 "unmap": true, 00:16:17.735 "flush": true, 00:16:17.735 "reset": true, 00:16:17.735 "nvme_admin": false, 00:16:17.735 "nvme_io": false, 00:16:17.735 "nvme_io_md": false, 00:16:17.735 "write_zeroes": true, 00:16:17.735 "zcopy": true, 00:16:17.735 "get_zone_info": false, 00:16:17.735 "zone_management": false, 00:16:17.735 "zone_append": false, 00:16:17.735 "compare": false, 00:16:17.735 "compare_and_write": false, 00:16:17.735 "abort": true, 00:16:17.735 "seek_hole": false, 00:16:17.735 "seek_data": false, 00:16:17.735 "copy": true, 00:16:17.735 "nvme_iov_md": false 00:16:17.735 }, 00:16:17.735 "memory_domains": [ 00:16:17.735 { 00:16:17.735 "dma_device_id": "system", 00:16:17.735 "dma_device_type": 1 00:16:17.735 }, 00:16:17.735 { 00:16:17.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.735 "dma_device_type": 2 00:16:17.735 } 00:16:17.735 ], 00:16:17.735 
"driver_specific": {} 00:16:17.735 }' 00:16:17.735 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.735 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.994 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.994 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.994 10:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.994 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.994 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.994 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.994 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.994 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.994 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.254 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.254 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.254 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:18.254 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.513 "name": "BaseBdev2", 00:16:18.513 "aliases": [ 00:16:18.513 "f72baa69-937e-4eda-b642-368735835c77" 00:16:18.513 ], 00:16:18.513 "product_name": 
"Malloc disk", 00:16:18.513 "block_size": 512, 00:16:18.513 "num_blocks": 65536, 00:16:18.513 "uuid": "f72baa69-937e-4eda-b642-368735835c77", 00:16:18.513 "assigned_rate_limits": { 00:16:18.513 "rw_ios_per_sec": 0, 00:16:18.513 "rw_mbytes_per_sec": 0, 00:16:18.513 "r_mbytes_per_sec": 0, 00:16:18.513 "w_mbytes_per_sec": 0 00:16:18.513 }, 00:16:18.513 "claimed": true, 00:16:18.513 "claim_type": "exclusive_write", 00:16:18.513 "zoned": false, 00:16:18.513 "supported_io_types": { 00:16:18.513 "read": true, 00:16:18.513 "write": true, 00:16:18.513 "unmap": true, 00:16:18.513 "flush": true, 00:16:18.513 "reset": true, 00:16:18.513 "nvme_admin": false, 00:16:18.513 "nvme_io": false, 00:16:18.513 "nvme_io_md": false, 00:16:18.513 "write_zeroes": true, 00:16:18.513 "zcopy": true, 00:16:18.513 "get_zone_info": false, 00:16:18.513 "zone_management": false, 00:16:18.513 "zone_append": false, 00:16:18.513 "compare": false, 00:16:18.513 "compare_and_write": false, 00:16:18.513 "abort": true, 00:16:18.513 "seek_hole": false, 00:16:18.513 "seek_data": false, 00:16:18.513 "copy": true, 00:16:18.513 "nvme_iov_md": false 00:16:18.513 }, 00:16:18.513 "memory_domains": [ 00:16:18.513 { 00:16:18.513 "dma_device_id": "system", 00:16:18.513 "dma_device_type": 1 00:16:18.513 }, 00:16:18.513 { 00:16:18.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.513 "dma_device_type": 2 00:16:18.513 } 00:16:18.513 ], 00:16:18.513 "driver_specific": {} 00:16:18.513 }' 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.513 
10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.513 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.772 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.772 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.772 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.772 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.772 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:18.772 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.772 10:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:19.341 "name": "BaseBdev3", 00:16:19.341 "aliases": [ 00:16:19.341 "17b13b2c-8db1-4198-9fb9-5f05da0517a5" 00:16:19.341 ], 00:16:19.341 "product_name": "Malloc disk", 00:16:19.341 "block_size": 512, 00:16:19.341 "num_blocks": 65536, 00:16:19.341 "uuid": "17b13b2c-8db1-4198-9fb9-5f05da0517a5", 00:16:19.341 "assigned_rate_limits": { 00:16:19.341 "rw_ios_per_sec": 0, 00:16:19.341 "rw_mbytes_per_sec": 0, 00:16:19.341 "r_mbytes_per_sec": 0, 00:16:19.341 "w_mbytes_per_sec": 0 00:16:19.341 }, 00:16:19.341 "claimed": true, 00:16:19.341 "claim_type": "exclusive_write", 00:16:19.341 "zoned": false, 00:16:19.341 "supported_io_types": { 00:16:19.341 "read": true, 00:16:19.341 "write": true, 00:16:19.341 "unmap": true, 
00:16:19.341 "flush": true, 00:16:19.341 "reset": true, 00:16:19.341 "nvme_admin": false, 00:16:19.341 "nvme_io": false, 00:16:19.341 "nvme_io_md": false, 00:16:19.341 "write_zeroes": true, 00:16:19.341 "zcopy": true, 00:16:19.341 "get_zone_info": false, 00:16:19.341 "zone_management": false, 00:16:19.341 "zone_append": false, 00:16:19.341 "compare": false, 00:16:19.341 "compare_and_write": false, 00:16:19.341 "abort": true, 00:16:19.341 "seek_hole": false, 00:16:19.341 "seek_data": false, 00:16:19.341 "copy": true, 00:16:19.341 "nvme_iov_md": false 00:16:19.341 }, 00:16:19.341 "memory_domains": [ 00:16:19.341 { 00:16:19.341 "dma_device_id": "system", 00:16:19.341 "dma_device_type": 1 00:16:19.341 }, 00:16:19.341 { 00:16:19.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.341 "dma_device_type": 2 00:16:19.341 } 00:16:19.341 ], 00:16:19.341 "driver_specific": {} 00:16:19.341 }' 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:19.341 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.600 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:19.600 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:19.600 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.600 10:23:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:19.600 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:19.600 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:19.860 [2024-07-15 10:23:56.813117] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:19.860 [2024-07-15 10:23:56.813142] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:19.860 [2024-07-15 10:23:56.813195] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:19.860 [2024-07-15 10:23:56.813246] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:19.860 [2024-07-15 10:23:56.813258] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x80ff50 name Existed_Raid, state offline 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 512196 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 512196 ']' 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 512196 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 512196 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 512196' 00:16:19.860 killing process with pid 512196 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 512196 00:16:19.860 [2024-07-15 10:23:56.871431] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:19.860 10:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 512196 00:16:19.860 [2024-07-15 10:23:56.897762] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:20.120 10:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:20.120 00:16:20.120 real 0m28.023s 00:16:20.120 user 0m51.917s 00:16:20.120 sys 0m5.037s 00:16:20.120 10:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:20.120 10:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:20.120 ************************************ 00:16:20.120 END TEST raid_state_function_test_sb 00:16:20.120 ************************************ 00:16:20.120 10:23:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:20.120 10:23:57 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:16:20.120 10:23:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:20.120 10:23:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:20.120 10:23:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:20.120 ************************************ 00:16:20.120 START TEST raid_superblock_test 00:16:20.120 ************************************ 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local 
raid_level=concat 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=516466 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 516466 /var/tmp/spdk-raid.sock 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 516466 ']' 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:20.120 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:20.120 10:23:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.120 [2024-07-15 10:23:57.239355] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:16:20.120 [2024-07-15 10:23:57.239421] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid516466 ] 00:16:20.379 [2024-07-15 10:23:57.369471] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:20.379 [2024-07-15 10:23:57.475501] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:20.379 [2024-07-15 10:23:57.547479] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:20.379 [2024-07-15 10:23:57.547516] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:20.947 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:21.206 malloc1 00:16:21.206 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:21.465 [2024-07-15 10:23:58.473507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:21.465 [2024-07-15 10:23:58.473552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.465 [2024-07-15 10:23:58.473574] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e2e570 00:16:21.465 [2024-07-15 10:23:58.473586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.465 [2024-07-15 10:23:58.475292] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.465 [2024-07-15 10:23:58.475320] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:21.465 pt1 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:21.465 10:23:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:21.465 10:23:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:22.033 malloc2 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:22.033 [2024-07-15 10:23:59.157299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:22.033 [2024-07-15 10:23:59.157346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.033 [2024-07-15 10:23:59.157363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e2f970 00:16:22.033 [2024-07-15 10:23:59.157375] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.033 [2024-07-15 10:23:59.159018] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.033 [2024-07-15 10:23:59.159047] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:22.033 pt2 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:22.033 10:23:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:22.033 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:22.293 malloc3 00:16:22.293 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:22.551 [2024-07-15 10:23:59.555959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:22.551 [2024-07-15 10:23:59.556006] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:22.552 [2024-07-15 10:23:59.556023] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fc6340 00:16:22.552 [2024-07-15 10:23:59.556036] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:22.552 [2024-07-15 10:23:59.557617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:22.552 [2024-07-15 10:23:59.557645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:22.552 pt3 00:16:22.552 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:22.552 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:22.552 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:22.810 [2024-07-15 10:23:59.792605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:22.810 [2024-07-15 10:23:59.793942] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:16:22.810 [2024-07-15 10:23:59.793997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:22.810 [2024-07-15 10:23:59.794145] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e26ea0 00:16:22.810 [2024-07-15 10:23:59.794156] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:22.810 [2024-07-15 10:23:59.794358] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e2e240 00:16:22.810 [2024-07-15 10:23:59.794499] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e26ea0 00:16:22.810 [2024-07-15 10:23:59.794509] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e26ea0 00:16:22.810 [2024-07-15 10:23:59.794606] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.810 
10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.810 10:23:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:23.376 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.376 "name": "raid_bdev1", 00:16:23.376 "uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:23.376 "strip_size_kb": 64, 00:16:23.376 "state": "online", 00:16:23.376 "raid_level": "concat", 00:16:23.376 "superblock": true, 00:16:23.376 "num_base_bdevs": 3, 00:16:23.376 "num_base_bdevs_discovered": 3, 00:16:23.376 "num_base_bdevs_operational": 3, 00:16:23.376 "base_bdevs_list": [ 00:16:23.376 { 00:16:23.376 "name": "pt1", 00:16:23.376 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:23.376 "is_configured": true, 00:16:23.376 "data_offset": 2048, 00:16:23.376 "data_size": 63488 00:16:23.376 }, 00:16:23.376 { 00:16:23.376 "name": "pt2", 00:16:23.376 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:23.376 "is_configured": true, 00:16:23.376 "data_offset": 2048, 00:16:23.376 "data_size": 63488 00:16:23.376 }, 00:16:23.376 { 00:16:23.376 "name": "pt3", 00:16:23.376 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:23.376 "is_configured": true, 00:16:23.376 "data_offset": 2048, 00:16:23.376 "data_size": 63488 00:16:23.376 } 00:16:23.376 ] 00:16:23.376 }' 00:16:23.376 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.376 10:24:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:23.938 10:24:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:24.195 [2024-07-15 10:24:01.148453] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:24.195 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:24.195 "name": "raid_bdev1", 00:16:24.195 "aliases": [ 00:16:24.195 "c17d33be-aa6a-4c37-8856-62e430d4f73d" 00:16:24.195 ], 00:16:24.195 "product_name": "Raid Volume", 00:16:24.195 "block_size": 512, 00:16:24.195 "num_blocks": 190464, 00:16:24.195 "uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:24.195 "assigned_rate_limits": { 00:16:24.195 "rw_ios_per_sec": 0, 00:16:24.195 "rw_mbytes_per_sec": 0, 00:16:24.195 "r_mbytes_per_sec": 0, 00:16:24.195 "w_mbytes_per_sec": 0 00:16:24.195 }, 00:16:24.195 "claimed": false, 00:16:24.195 "zoned": false, 00:16:24.195 "supported_io_types": { 00:16:24.195 "read": true, 00:16:24.195 "write": true, 00:16:24.195 "unmap": true, 00:16:24.195 "flush": true, 00:16:24.195 "reset": true, 00:16:24.195 "nvme_admin": false, 00:16:24.195 "nvme_io": false, 00:16:24.195 "nvme_io_md": false, 00:16:24.195 "write_zeroes": true, 00:16:24.195 "zcopy": false, 00:16:24.195 "get_zone_info": false, 00:16:24.195 "zone_management": false, 00:16:24.195 "zone_append": false, 00:16:24.195 "compare": false, 00:16:24.195 "compare_and_write": false, 00:16:24.195 "abort": false, 00:16:24.195 
"seek_hole": false, 00:16:24.195 "seek_data": false, 00:16:24.195 "copy": false, 00:16:24.195 "nvme_iov_md": false 00:16:24.195 }, 00:16:24.195 "memory_domains": [ 00:16:24.195 { 00:16:24.195 "dma_device_id": "system", 00:16:24.195 "dma_device_type": 1 00:16:24.195 }, 00:16:24.195 { 00:16:24.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.195 "dma_device_type": 2 00:16:24.195 }, 00:16:24.195 { 00:16:24.195 "dma_device_id": "system", 00:16:24.195 "dma_device_type": 1 00:16:24.195 }, 00:16:24.195 { 00:16:24.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.196 "dma_device_type": 2 00:16:24.196 }, 00:16:24.196 { 00:16:24.196 "dma_device_id": "system", 00:16:24.196 "dma_device_type": 1 00:16:24.196 }, 00:16:24.196 { 00:16:24.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.196 "dma_device_type": 2 00:16:24.196 } 00:16:24.196 ], 00:16:24.196 "driver_specific": { 00:16:24.196 "raid": { 00:16:24.196 "uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:24.196 "strip_size_kb": 64, 00:16:24.196 "state": "online", 00:16:24.196 "raid_level": "concat", 00:16:24.196 "superblock": true, 00:16:24.196 "num_base_bdevs": 3, 00:16:24.196 "num_base_bdevs_discovered": 3, 00:16:24.196 "num_base_bdevs_operational": 3, 00:16:24.196 "base_bdevs_list": [ 00:16:24.196 { 00:16:24.196 "name": "pt1", 00:16:24.196 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:24.196 "is_configured": true, 00:16:24.196 "data_offset": 2048, 00:16:24.196 "data_size": 63488 00:16:24.196 }, 00:16:24.196 { 00:16:24.196 "name": "pt2", 00:16:24.196 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:24.196 "is_configured": true, 00:16:24.196 "data_offset": 2048, 00:16:24.196 "data_size": 63488 00:16:24.196 }, 00:16:24.196 { 00:16:24.196 "name": "pt3", 00:16:24.196 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:24.196 "is_configured": true, 00:16:24.196 "data_offset": 2048, 00:16:24.196 "data_size": 63488 00:16:24.196 } 00:16:24.196 ] 00:16:24.196 } 00:16:24.196 } 00:16:24.196 }' 
00:16:24.196 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:24.196 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:24.196 pt2 00:16:24.196 pt3' 00:16:24.196 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.196 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:24.196 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:24.454 "name": "pt1", 00:16:24.454 "aliases": [ 00:16:24.454 "00000000-0000-0000-0000-000000000001" 00:16:24.454 ], 00:16:24.454 "product_name": "passthru", 00:16:24.454 "block_size": 512, 00:16:24.454 "num_blocks": 65536, 00:16:24.454 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:24.454 "assigned_rate_limits": { 00:16:24.454 "rw_ios_per_sec": 0, 00:16:24.454 "rw_mbytes_per_sec": 0, 00:16:24.454 "r_mbytes_per_sec": 0, 00:16:24.454 "w_mbytes_per_sec": 0 00:16:24.454 }, 00:16:24.454 "claimed": true, 00:16:24.454 "claim_type": "exclusive_write", 00:16:24.454 "zoned": false, 00:16:24.454 "supported_io_types": { 00:16:24.454 "read": true, 00:16:24.454 "write": true, 00:16:24.454 "unmap": true, 00:16:24.454 "flush": true, 00:16:24.454 "reset": true, 00:16:24.454 "nvme_admin": false, 00:16:24.454 "nvme_io": false, 00:16:24.454 "nvme_io_md": false, 00:16:24.454 "write_zeroes": true, 00:16:24.454 "zcopy": true, 00:16:24.454 "get_zone_info": false, 00:16:24.454 "zone_management": false, 00:16:24.454 "zone_append": false, 00:16:24.454 "compare": false, 00:16:24.454 "compare_and_write": false, 00:16:24.454 "abort": true, 00:16:24.454 "seek_hole": false, 00:16:24.454 
"seek_data": false, 00:16:24.454 "copy": true, 00:16:24.454 "nvme_iov_md": false 00:16:24.454 }, 00:16:24.454 "memory_domains": [ 00:16:24.454 { 00:16:24.454 "dma_device_id": "system", 00:16:24.454 "dma_device_type": 1 00:16:24.454 }, 00:16:24.454 { 00:16:24.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.454 "dma_device_type": 2 00:16:24.454 } 00:16:24.454 ], 00:16:24.454 "driver_specific": { 00:16:24.454 "passthru": { 00:16:24.454 "name": "pt1", 00:16:24.454 "base_bdev_name": "malloc1" 00:16:24.454 } 00:16:24.454 } 00:16:24.454 }' 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:24.454 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:24.713 10:24:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:24.971 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:24.971 "name": "pt2", 00:16:24.971 "aliases": [ 00:16:24.971 "00000000-0000-0000-0000-000000000002" 00:16:24.971 ], 00:16:24.971 "product_name": "passthru", 00:16:24.971 "block_size": 512, 00:16:24.971 "num_blocks": 65536, 00:16:24.971 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:24.971 "assigned_rate_limits": { 00:16:24.971 "rw_ios_per_sec": 0, 00:16:24.971 "rw_mbytes_per_sec": 0, 00:16:24.971 "r_mbytes_per_sec": 0, 00:16:24.971 "w_mbytes_per_sec": 0 00:16:24.971 }, 00:16:24.971 "claimed": true, 00:16:24.971 "claim_type": "exclusive_write", 00:16:24.971 "zoned": false, 00:16:24.972 "supported_io_types": { 00:16:24.972 "read": true, 00:16:24.972 "write": true, 00:16:24.972 "unmap": true, 00:16:24.972 "flush": true, 00:16:24.972 "reset": true, 00:16:24.972 "nvme_admin": false, 00:16:24.972 "nvme_io": false, 00:16:24.972 "nvme_io_md": false, 00:16:24.972 "write_zeroes": true, 00:16:24.972 "zcopy": true, 00:16:24.972 "get_zone_info": false, 00:16:24.972 "zone_management": false, 00:16:24.972 "zone_append": false, 00:16:24.972 "compare": false, 00:16:24.972 "compare_and_write": false, 00:16:24.972 "abort": true, 00:16:24.972 "seek_hole": false, 00:16:24.972 "seek_data": false, 00:16:24.972 "copy": true, 00:16:24.972 "nvme_iov_md": false 00:16:24.972 }, 00:16:24.972 "memory_domains": [ 00:16:24.972 { 00:16:24.972 "dma_device_id": "system", 00:16:24.972 "dma_device_type": 1 00:16:24.972 }, 00:16:24.972 { 00:16:24.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.972 "dma_device_type": 2 00:16:24.972 } 00:16:24.972 ], 00:16:24.972 "driver_specific": { 00:16:24.972 "passthru": { 00:16:24.972 "name": "pt2", 00:16:24.972 "base_bdev_name": "malloc2" 00:16:24.972 } 00:16:24.972 } 00:16:24.972 }' 00:16:24.972 10:24:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.972 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:24.972 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:24.972 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:25.231 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:25.489 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:25.489 "name": "pt3", 00:16:25.489 "aliases": [ 00:16:25.489 "00000000-0000-0000-0000-000000000003" 00:16:25.489 ], 00:16:25.489 "product_name": "passthru", 00:16:25.489 "block_size": 512, 00:16:25.489 "num_blocks": 65536, 00:16:25.489 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:25.489 "assigned_rate_limits": { 
00:16:25.489 "rw_ios_per_sec": 0, 00:16:25.489 "rw_mbytes_per_sec": 0, 00:16:25.489 "r_mbytes_per_sec": 0, 00:16:25.489 "w_mbytes_per_sec": 0 00:16:25.489 }, 00:16:25.489 "claimed": true, 00:16:25.489 "claim_type": "exclusive_write", 00:16:25.489 "zoned": false, 00:16:25.489 "supported_io_types": { 00:16:25.489 "read": true, 00:16:25.489 "write": true, 00:16:25.489 "unmap": true, 00:16:25.489 "flush": true, 00:16:25.489 "reset": true, 00:16:25.489 "nvme_admin": false, 00:16:25.489 "nvme_io": false, 00:16:25.489 "nvme_io_md": false, 00:16:25.489 "write_zeroes": true, 00:16:25.489 "zcopy": true, 00:16:25.489 "get_zone_info": false, 00:16:25.489 "zone_management": false, 00:16:25.489 "zone_append": false, 00:16:25.489 "compare": false, 00:16:25.489 "compare_and_write": false, 00:16:25.489 "abort": true, 00:16:25.489 "seek_hole": false, 00:16:25.489 "seek_data": false, 00:16:25.489 "copy": true, 00:16:25.489 "nvme_iov_md": false 00:16:25.489 }, 00:16:25.489 "memory_domains": [ 00:16:25.489 { 00:16:25.489 "dma_device_id": "system", 00:16:25.489 "dma_device_type": 1 00:16:25.489 }, 00:16:25.489 { 00:16:25.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.489 "dma_device_type": 2 00:16:25.489 } 00:16:25.489 ], 00:16:25.489 "driver_specific": { 00:16:25.489 "passthru": { 00:16:25.489 "name": "pt3", 00:16:25.489 "base_bdev_name": "malloc3" 00:16:25.489 } 00:16:25.489 } 00:16:25.489 }' 00:16:25.489 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.489 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:25.747 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.005 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.005 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.005 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:26.005 10:24:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:26.270 [2024-07-15 10:24:03.226111] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:26.270 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c17d33be-aa6a-4c37-8856-62e430d4f73d 00:16:26.270 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c17d33be-aa6a-4c37-8856-62e430d4f73d ']' 00:16:26.270 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:26.531 [2024-07-15 10:24:03.474485] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:26.531 [2024-07-15 10:24:03.474509] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:26.531 [2024-07-15 10:24:03.474563] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:26.531 [2024-07-15 10:24:03.474616] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:16:26.531 [2024-07-15 10:24:03.474628] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e26ea0 name raid_bdev1, state offline 00:16:26.531 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.531 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:26.789 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:26.789 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:26.789 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:26.789 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:26.789 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:26.789 10:24:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:27.354 10:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:27.354 10:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:27.612 10:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:27.612 10:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:27.870 10:24:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:28.128 [2024-07-15 10:24:05.202997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:28.128 [2024-07-15 10:24:05.204333] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:28.128 [2024-07-15 10:24:05.204375] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:28.128 [2024-07-15 10:24:05.204422] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:28.128 [2024-07-15 10:24:05.204460] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:28.128 [2024-07-15 10:24:05.204484] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:28.128 [2024-07-15 10:24:05.204502] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:28.128 [2024-07-15 10:24:05.204512] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fd1ff0 name raid_bdev1, state configuring 00:16:28.128 request: 00:16:28.128 { 00:16:28.128 "name": "raid_bdev1", 00:16:28.128 "raid_level": "concat", 00:16:28.128 "base_bdevs": [ 00:16:28.128 "malloc1", 00:16:28.128 "malloc2", 00:16:28.128 "malloc3" 00:16:28.128 ], 00:16:28.128 "strip_size_kb": 64, 00:16:28.128 "superblock": false, 00:16:28.128 "method": "bdev_raid_create", 00:16:28.128 "req_id": 1 00:16:28.128 } 00:16:28.128 Got JSON-RPC error response 00:16:28.128 response: 00:16:28.128 { 00:16:28.128 "code": -17, 00:16:28.128 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:28.128 } 00:16:28.128 10:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:28.128 10:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:16:28.128 10:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:28.128 10:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:28.128 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.128 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:28.386 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:28.386 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:28.386 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:28.645 [2024-07-15 10:24:05.696238] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:28.645 [2024-07-15 10:24:05.696285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:28.645 [2024-07-15 10:24:05.696306] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e2e7a0 00:16:28.645 [2024-07-15 10:24:05.696325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:28.645 [2024-07-15 10:24:05.697956] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:28.645 [2024-07-15 10:24:05.697983] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:28.645 [2024-07-15 10:24:05.698048] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:28.645 [2024-07-15 10:24:05.698076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:28.645 pt1 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:28.645 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:28.927 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.927 "name": "raid_bdev1", 00:16:28.927 "uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:28.927 "strip_size_kb": 64, 00:16:28.927 "state": "configuring", 00:16:28.927 "raid_level": "concat", 00:16:28.927 "superblock": true, 00:16:28.927 "num_base_bdevs": 3, 00:16:28.927 "num_base_bdevs_discovered": 1, 00:16:28.927 "num_base_bdevs_operational": 3, 00:16:28.927 "base_bdevs_list": [ 00:16:28.927 { 00:16:28.927 "name": "pt1", 00:16:28.927 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:28.927 
"is_configured": true, 00:16:28.927 "data_offset": 2048, 00:16:28.927 "data_size": 63488 00:16:28.927 }, 00:16:28.927 { 00:16:28.927 "name": null, 00:16:28.927 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:28.927 "is_configured": false, 00:16:28.927 "data_offset": 2048, 00:16:28.927 "data_size": 63488 00:16:28.927 }, 00:16:28.927 { 00:16:28.927 "name": null, 00:16:28.927 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:28.927 "is_configured": false, 00:16:28.927 "data_offset": 2048, 00:16:28.927 "data_size": 63488 00:16:28.927 } 00:16:28.927 ] 00:16:28.927 }' 00:16:28.927 10:24:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.927 10:24:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.492 10:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:29.492 10:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:29.751 [2024-07-15 10:24:06.771098] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:29.751 [2024-07-15 10:24:06.771146] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:29.751 [2024-07-15 10:24:06.771165] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e25c70 00:16:29.751 [2024-07-15 10:24:06.771177] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:29.751 [2024-07-15 10:24:06.771519] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:29.751 [2024-07-15 10:24:06.771536] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:29.751 [2024-07-15 10:24:06.771598] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:29.751 [2024-07-15 
10:24:06.771617] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:29.751 pt2 00:16:29.751 10:24:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:30.009 [2024-07-15 10:24:07.015757] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.009 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:30.267 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:30.267 "name": "raid_bdev1", 00:16:30.267 
"uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:30.267 "strip_size_kb": 64, 00:16:30.267 "state": "configuring", 00:16:30.267 "raid_level": "concat", 00:16:30.267 "superblock": true, 00:16:30.267 "num_base_bdevs": 3, 00:16:30.267 "num_base_bdevs_discovered": 1, 00:16:30.267 "num_base_bdevs_operational": 3, 00:16:30.267 "base_bdevs_list": [ 00:16:30.267 { 00:16:30.267 "name": "pt1", 00:16:30.267 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:30.267 "is_configured": true, 00:16:30.267 "data_offset": 2048, 00:16:30.267 "data_size": 63488 00:16:30.267 }, 00:16:30.267 { 00:16:30.267 "name": null, 00:16:30.267 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:30.267 "is_configured": false, 00:16:30.267 "data_offset": 2048, 00:16:30.267 "data_size": 63488 00:16:30.267 }, 00:16:30.267 { 00:16:30.267 "name": null, 00:16:30.267 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:30.267 "is_configured": false, 00:16:30.267 "data_offset": 2048, 00:16:30.267 "data_size": 63488 00:16:30.267 } 00:16:30.267 ] 00:16:30.267 }' 00:16:30.267 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:30.267 10:24:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:30.833 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:30.833 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:30.833 10:24:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:31.092 [2024-07-15 10:24:08.126723] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:31.092 [2024-07-15 10:24:08.126777] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:31.092 [2024-07-15 10:24:08.126799] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e2ea10 00:16:31.092 [2024-07-15 10:24:08.126812] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:31.092 [2024-07-15 10:24:08.127162] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:31.092 [2024-07-15 10:24:08.127181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:31.092 [2024-07-15 10:24:08.127247] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:31.092 [2024-07-15 10:24:08.127267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:31.092 pt2 00:16:31.092 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:31.092 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:31.092 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:31.350 [2024-07-15 10:24:08.371371] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:31.350 [2024-07-15 10:24:08.371417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:31.350 [2024-07-15 10:24:08.371435] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fc8740 00:16:31.350 [2024-07-15 10:24:08.371448] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:31.350 [2024-07-15 10:24:08.371779] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:31.350 [2024-07-15 10:24:08.371795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:31.350 [2024-07-15 10:24:08.371857] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:31.350 
[2024-07-15 10:24:08.371876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:31.350 [2024-07-15 10:24:08.372000] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fc8c00 00:16:31.350 [2024-07-15 10:24:08.372011] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:31.350 [2024-07-15 10:24:08.372182] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e2da40 00:16:31.350 [2024-07-15 10:24:08.372309] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fc8c00 00:16:31.350 [2024-07-15 10:24:08.372319] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fc8c00 00:16:31.350 [2024-07-15 10:24:08.372412] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:31.350 pt3 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.350 
10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.350 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:31.609 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.609 "name": "raid_bdev1", 00:16:31.609 "uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:31.609 "strip_size_kb": 64, 00:16:31.609 "state": "online", 00:16:31.609 "raid_level": "concat", 00:16:31.609 "superblock": true, 00:16:31.609 "num_base_bdevs": 3, 00:16:31.609 "num_base_bdevs_discovered": 3, 00:16:31.609 "num_base_bdevs_operational": 3, 00:16:31.609 "base_bdevs_list": [ 00:16:31.609 { 00:16:31.609 "name": "pt1", 00:16:31.609 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:31.609 "is_configured": true, 00:16:31.609 "data_offset": 2048, 00:16:31.609 "data_size": 63488 00:16:31.609 }, 00:16:31.609 { 00:16:31.609 "name": "pt2", 00:16:31.609 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:31.609 "is_configured": true, 00:16:31.609 "data_offset": 2048, 00:16:31.609 "data_size": 63488 00:16:31.609 }, 00:16:31.609 { 00:16:31.609 "name": "pt3", 00:16:31.609 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:31.609 "is_configured": true, 00:16:31.609 "data_offset": 2048, 00:16:31.609 "data_size": 63488 00:16:31.609 } 00:16:31.609 ] 00:16:31.609 }' 00:16:31.609 10:24:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.609 10:24:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:32.174 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:32.431 [2024-07-15 10:24:09.482599] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:32.431 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:32.431 "name": "raid_bdev1", 00:16:32.431 "aliases": [ 00:16:32.431 "c17d33be-aa6a-4c37-8856-62e430d4f73d" 00:16:32.431 ], 00:16:32.431 "product_name": "Raid Volume", 00:16:32.431 "block_size": 512, 00:16:32.431 "num_blocks": 190464, 00:16:32.431 "uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:32.431 "assigned_rate_limits": { 00:16:32.431 "rw_ios_per_sec": 0, 00:16:32.431 "rw_mbytes_per_sec": 0, 00:16:32.431 "r_mbytes_per_sec": 0, 00:16:32.431 "w_mbytes_per_sec": 0 00:16:32.431 }, 00:16:32.431 "claimed": false, 00:16:32.431 "zoned": false, 00:16:32.431 "supported_io_types": { 00:16:32.431 "read": true, 00:16:32.431 "write": true, 00:16:32.431 "unmap": true, 00:16:32.431 "flush": true, 00:16:32.431 "reset": true, 00:16:32.431 "nvme_admin": false, 00:16:32.431 "nvme_io": false, 00:16:32.431 "nvme_io_md": false, 00:16:32.431 "write_zeroes": true, 00:16:32.431 "zcopy": false, 00:16:32.431 
"get_zone_info": false, 00:16:32.431 "zone_management": false, 00:16:32.431 "zone_append": false, 00:16:32.431 "compare": false, 00:16:32.431 "compare_and_write": false, 00:16:32.431 "abort": false, 00:16:32.431 "seek_hole": false, 00:16:32.431 "seek_data": false, 00:16:32.431 "copy": false, 00:16:32.431 "nvme_iov_md": false 00:16:32.431 }, 00:16:32.431 "memory_domains": [ 00:16:32.431 { 00:16:32.431 "dma_device_id": "system", 00:16:32.431 "dma_device_type": 1 00:16:32.431 }, 00:16:32.431 { 00:16:32.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.431 "dma_device_type": 2 00:16:32.431 }, 00:16:32.431 { 00:16:32.431 "dma_device_id": "system", 00:16:32.431 "dma_device_type": 1 00:16:32.431 }, 00:16:32.431 { 00:16:32.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.431 "dma_device_type": 2 00:16:32.431 }, 00:16:32.431 { 00:16:32.431 "dma_device_id": "system", 00:16:32.431 "dma_device_type": 1 00:16:32.431 }, 00:16:32.431 { 00:16:32.431 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.431 "dma_device_type": 2 00:16:32.431 } 00:16:32.431 ], 00:16:32.431 "driver_specific": { 00:16:32.431 "raid": { 00:16:32.431 "uuid": "c17d33be-aa6a-4c37-8856-62e430d4f73d", 00:16:32.431 "strip_size_kb": 64, 00:16:32.431 "state": "online", 00:16:32.431 "raid_level": "concat", 00:16:32.431 "superblock": true, 00:16:32.431 "num_base_bdevs": 3, 00:16:32.431 "num_base_bdevs_discovered": 3, 00:16:32.431 "num_base_bdevs_operational": 3, 00:16:32.431 "base_bdevs_list": [ 00:16:32.431 { 00:16:32.431 "name": "pt1", 00:16:32.431 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:32.431 "is_configured": true, 00:16:32.431 "data_offset": 2048, 00:16:32.431 "data_size": 63488 00:16:32.431 }, 00:16:32.431 { 00:16:32.431 "name": "pt2", 00:16:32.431 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:32.431 "is_configured": true, 00:16:32.431 "data_offset": 2048, 00:16:32.431 "data_size": 63488 00:16:32.431 }, 00:16:32.431 { 00:16:32.431 "name": "pt3", 00:16:32.431 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:16:32.431 "is_configured": true, 00:16:32.432 "data_offset": 2048, 00:16:32.432 "data_size": 63488 00:16:32.432 } 00:16:32.432 ] 00:16:32.432 } 00:16:32.432 } 00:16:32.432 }' 00:16:32.432 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:32.432 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:32.432 pt2 00:16:32.432 pt3' 00:16:32.432 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.432 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:32.432 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:32.689 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:32.689 "name": "pt1", 00:16:32.689 "aliases": [ 00:16:32.689 "00000000-0000-0000-0000-000000000001" 00:16:32.689 ], 00:16:32.689 "product_name": "passthru", 00:16:32.689 "block_size": 512, 00:16:32.689 "num_blocks": 65536, 00:16:32.689 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:32.689 "assigned_rate_limits": { 00:16:32.689 "rw_ios_per_sec": 0, 00:16:32.689 "rw_mbytes_per_sec": 0, 00:16:32.689 "r_mbytes_per_sec": 0, 00:16:32.689 "w_mbytes_per_sec": 0 00:16:32.689 }, 00:16:32.689 "claimed": true, 00:16:32.689 "claim_type": "exclusive_write", 00:16:32.689 "zoned": false, 00:16:32.689 "supported_io_types": { 00:16:32.689 "read": true, 00:16:32.689 "write": true, 00:16:32.689 "unmap": true, 00:16:32.689 "flush": true, 00:16:32.689 "reset": true, 00:16:32.689 "nvme_admin": false, 00:16:32.689 "nvme_io": false, 00:16:32.689 "nvme_io_md": false, 00:16:32.689 "write_zeroes": true, 00:16:32.689 "zcopy": true, 00:16:32.689 "get_zone_info": false, 
00:16:32.689 "zone_management": false, 00:16:32.689 "zone_append": false, 00:16:32.689 "compare": false, 00:16:32.689 "compare_and_write": false, 00:16:32.689 "abort": true, 00:16:32.689 "seek_hole": false, 00:16:32.689 "seek_data": false, 00:16:32.689 "copy": true, 00:16:32.689 "nvme_iov_md": false 00:16:32.689 }, 00:16:32.689 "memory_domains": [ 00:16:32.689 { 00:16:32.689 "dma_device_id": "system", 00:16:32.689 "dma_device_type": 1 00:16:32.689 }, 00:16:32.689 { 00:16:32.689 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.689 "dma_device_type": 2 00:16:32.689 } 00:16:32.689 ], 00:16:32.689 "driver_specific": { 00:16:32.689 "passthru": { 00:16:32.689 "name": "pt1", 00:16:32.689 "base_bdev_name": "malloc1" 00:16:32.689 } 00:16:32.689 } 00:16:32.689 }' 00:16:32.689 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.689 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.946 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.946 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.946 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.946 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.946 10:24:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.946 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.946 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.946 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.946 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.946 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.946 10:24:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.946 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:32.946 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.203 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.203 "name": "pt2", 00:16:33.204 "aliases": [ 00:16:33.204 "00000000-0000-0000-0000-000000000002" 00:16:33.204 ], 00:16:33.204 "product_name": "passthru", 00:16:33.204 "block_size": 512, 00:16:33.204 "num_blocks": 65536, 00:16:33.204 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:33.204 "assigned_rate_limits": { 00:16:33.204 "rw_ios_per_sec": 0, 00:16:33.204 "rw_mbytes_per_sec": 0, 00:16:33.204 "r_mbytes_per_sec": 0, 00:16:33.204 "w_mbytes_per_sec": 0 00:16:33.204 }, 00:16:33.204 "claimed": true, 00:16:33.204 "claim_type": "exclusive_write", 00:16:33.204 "zoned": false, 00:16:33.204 "supported_io_types": { 00:16:33.204 "read": true, 00:16:33.204 "write": true, 00:16:33.204 "unmap": true, 00:16:33.204 "flush": true, 00:16:33.204 "reset": true, 00:16:33.204 "nvme_admin": false, 00:16:33.204 "nvme_io": false, 00:16:33.204 "nvme_io_md": false, 00:16:33.204 "write_zeroes": true, 00:16:33.204 "zcopy": true, 00:16:33.204 "get_zone_info": false, 00:16:33.204 "zone_management": false, 00:16:33.204 "zone_append": false, 00:16:33.204 "compare": false, 00:16:33.204 "compare_and_write": false, 00:16:33.204 "abort": true, 00:16:33.204 "seek_hole": false, 00:16:33.204 "seek_data": false, 00:16:33.204 "copy": true, 00:16:33.204 "nvme_iov_md": false 00:16:33.204 }, 00:16:33.204 "memory_domains": [ 00:16:33.204 { 00:16:33.204 "dma_device_id": "system", 00:16:33.204 "dma_device_type": 1 00:16:33.204 }, 00:16:33.204 { 00:16:33.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.204 
"dma_device_type": 2 00:16:33.204 } 00:16:33.204 ], 00:16:33.204 "driver_specific": { 00:16:33.204 "passthru": { 00:16:33.204 "name": "pt2", 00:16:33.204 "base_bdev_name": "malloc2" 00:16:33.204 } 00:16:33.204 } 00:16:33.204 }' 00:16:33.204 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.460 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.716 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.716 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.716 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:33.716 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:33.716 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.972 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.972 "name": "pt3", 00:16:33.972 "aliases": [ 00:16:33.972 
"00000000-0000-0000-0000-000000000003" 00:16:33.972 ], 00:16:33.972 "product_name": "passthru", 00:16:33.972 "block_size": 512, 00:16:33.972 "num_blocks": 65536, 00:16:33.972 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:33.972 "assigned_rate_limits": { 00:16:33.972 "rw_ios_per_sec": 0, 00:16:33.972 "rw_mbytes_per_sec": 0, 00:16:33.972 "r_mbytes_per_sec": 0, 00:16:33.972 "w_mbytes_per_sec": 0 00:16:33.972 }, 00:16:33.972 "claimed": true, 00:16:33.972 "claim_type": "exclusive_write", 00:16:33.972 "zoned": false, 00:16:33.972 "supported_io_types": { 00:16:33.972 "read": true, 00:16:33.972 "write": true, 00:16:33.972 "unmap": true, 00:16:33.972 "flush": true, 00:16:33.972 "reset": true, 00:16:33.972 "nvme_admin": false, 00:16:33.972 "nvme_io": false, 00:16:33.972 "nvme_io_md": false, 00:16:33.972 "write_zeroes": true, 00:16:33.972 "zcopy": true, 00:16:33.972 "get_zone_info": false, 00:16:33.972 "zone_management": false, 00:16:33.972 "zone_append": false, 00:16:33.972 "compare": false, 00:16:33.972 "compare_and_write": false, 00:16:33.972 "abort": true, 00:16:33.973 "seek_hole": false, 00:16:33.973 "seek_data": false, 00:16:33.973 "copy": true, 00:16:33.973 "nvme_iov_md": false 00:16:33.973 }, 00:16:33.973 "memory_domains": [ 00:16:33.973 { 00:16:33.973 "dma_device_id": "system", 00:16:33.973 "dma_device_type": 1 00:16:33.973 }, 00:16:33.973 { 00:16:33.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.973 "dma_device_type": 2 00:16:33.973 } 00:16:33.973 ], 00:16:33.973 "driver_specific": { 00:16:33.973 "passthru": { 00:16:33.973 "name": "pt3", 00:16:33.973 "base_bdev_name": "malloc3" 00:16:33.973 } 00:16:33.973 } 00:16:33.973 }' 00:16:33.973 10:24:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.973 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.973 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.973 10:24:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.973 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:34.229 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:34.486 [2024-07-15 10:24:11.580138] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c17d33be-aa6a-4c37-8856-62e430d4f73d '!=' c17d33be-aa6a-4c37-8856-62e430d4f73d ']' 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 516466 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 516466 ']' 00:16:34.486 10:24:11 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 516466 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:34.486 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 516466 00:16:34.487 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:34.487 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:34.487 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 516466' 00:16:34.487 killing process with pid 516466 00:16:34.487 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 516466 00:16:34.487 [2024-07-15 10:24:11.652798] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:34.487 [2024-07-15 10:24:11.652856] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:34.487 [2024-07-15 10:24:11.652915] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:34.487 [2024-07-15 10:24:11.652939] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fc8c00 name raid_bdev1, state offline 00:16:34.487 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 516466 00:16:34.487 [2024-07-15 10:24:11.682216] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:34.743 10:24:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:34.743 00:16:34.743 real 0m14.714s 00:16:34.743 user 0m26.466s 00:16:34.743 sys 0m2.665s 00:16:34.743 10:24:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:34.743 10:24:11 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@10 -- # set +x 00:16:34.743 ************************************ 00:16:34.743 END TEST raid_superblock_test 00:16:34.743 ************************************ 00:16:34.743 10:24:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:34.743 10:24:11 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:16:34.743 10:24:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:34.743 10:24:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:34.743 10:24:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:34.998 ************************************ 00:16:34.998 START TEST raid_read_error_test 00:16:34.998 ************************************ 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.hvoX1DQLnc 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=519213 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 519213 /var/tmp/spdk-raid.sock 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 519213 ']' 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:34.998 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:34.998 10:24:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:34.999 [2024-07-15 10:24:12.046125] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:16:34.999 [2024-07-15 10:24:12.046189] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid519213 ] 00:16:34.999 [2024-07-15 10:24:12.162680] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:35.255 [2024-07-15 10:24:12.266878] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:35.255 [2024-07-15 10:24:12.331623] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:35.255 [2024-07-15 10:24:12.331672] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:35.958 10:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:35.958 10:24:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:35.958 10:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:35.959 10:24:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:36.217 BaseBdev1_malloc 00:16:36.217 10:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:36.474 true 00:16:36.474 10:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:36.731 [2024-07-15 10:24:13.701271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:36.731 [2024-07-15 10:24:13.701315] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:16:36.731 [2024-07-15 10:24:13.701337] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x144f0d0 00:16:36.731 [2024-07-15 10:24:13.701350] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:36.731 [2024-07-15 10:24:13.703207] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:36.731 [2024-07-15 10:24:13.703235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:36.731 BaseBdev1 00:16:36.731 10:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:36.731 10:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:36.988 BaseBdev2_malloc 00:16:36.988 10:24:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:37.246 true 00:16:37.247 10:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:37.247 [2024-07-15 10:24:14.439800] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:37.247 [2024-07-15 10:24:14.439846] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:37.247 [2024-07-15 10:24:14.439869] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1453910 00:16:37.247 [2024-07-15 10:24:14.439882] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:37.247 [2024-07-15 10:24:14.441525] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:37.247 [2024-07-15 10:24:14.441556] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:37.247 BaseBdev2 00:16:37.505 10:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:37.505 10:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:37.505 BaseBdev3_malloc 00:16:37.763 10:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:37.763 true 00:16:37.763 10:24:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:38.020 [2024-07-15 10:24:15.183622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:38.020 [2024-07-15 10:24:15.183668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:38.020 [2024-07-15 10:24:15.183689] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1455bd0 00:16:38.020 [2024-07-15 10:24:15.183702] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:38.020 [2024-07-15 10:24:15.185214] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:38.020 [2024-07-15 10:24:15.185243] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:38.020 BaseBdev3 00:16:38.278 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:38.278 [2024-07-15 10:24:15.444369] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:38.278 [2024-07-15 10:24:15.445695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:38.278 [2024-07-15 10:24:15.445765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:38.278 [2024-07-15 10:24:15.445981] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1457280 00:16:38.278 [2024-07-15 10:24:15.445993] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:38.278 [2024-07-15 10:24:15.446187] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1456e20 00:16:38.278 [2024-07-15 10:24:15.446335] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1457280 00:16:38.278 [2024-07-15 10:24:15.446345] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1457280 00:16:38.278 [2024-07-15 10:24:15.446449] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:38.278 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:38.278 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:38.278 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:38.278 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:38.278 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:38.278 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:38.279 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.279 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:38.279 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:38.279 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.537 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.537 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:38.537 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:38.537 "name": "raid_bdev1", 00:16:38.537 "uuid": "12a87396-e385-44ad-ac80-669d6515b3b2", 00:16:38.537 "strip_size_kb": 64, 00:16:38.537 "state": "online", 00:16:38.537 "raid_level": "concat", 00:16:38.537 "superblock": true, 00:16:38.537 "num_base_bdevs": 3, 00:16:38.537 "num_base_bdevs_discovered": 3, 00:16:38.537 "num_base_bdevs_operational": 3, 00:16:38.537 "base_bdevs_list": [ 00:16:38.537 { 00:16:38.537 "name": "BaseBdev1", 00:16:38.537 "uuid": "c4592939-f610-5913-8322-993bb78dc624", 00:16:38.537 "is_configured": true, 00:16:38.537 "data_offset": 2048, 00:16:38.537 "data_size": 63488 00:16:38.537 }, 00:16:38.537 { 00:16:38.537 "name": "BaseBdev2", 00:16:38.537 "uuid": "642ecf1c-6edf-5af9-8033-d2d7b3a49789", 00:16:38.537 "is_configured": true, 00:16:38.537 "data_offset": 2048, 00:16:38.537 "data_size": 63488 00:16:38.537 }, 00:16:38.537 { 00:16:38.537 "name": "BaseBdev3", 00:16:38.537 "uuid": "400a2674-510b-5ebe-a337-4bd4b85ed820", 00:16:38.537 "is_configured": true, 00:16:38.537 "data_offset": 2048, 00:16:38.537 "data_size": 63488 00:16:38.537 } 00:16:38.537 ] 00:16:38.537 }' 00:16:38.537 10:24:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:38.537 10:24:15 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.470 10:24:16 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:16:39.470 10:24:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:39.470 [2024-07-15 10:24:16.415226] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a54d0 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.404 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:40.662 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.662 "name": "raid_bdev1", 00:16:40.662 "uuid": "12a87396-e385-44ad-ac80-669d6515b3b2", 00:16:40.662 "strip_size_kb": 64, 00:16:40.662 "state": "online", 00:16:40.662 "raid_level": "concat", 00:16:40.662 "superblock": true, 00:16:40.662 "num_base_bdevs": 3, 00:16:40.662 "num_base_bdevs_discovered": 3, 00:16:40.662 "num_base_bdevs_operational": 3, 00:16:40.662 "base_bdevs_list": [ 00:16:40.662 { 00:16:40.662 "name": "BaseBdev1", 00:16:40.662 "uuid": "c4592939-f610-5913-8322-993bb78dc624", 00:16:40.662 "is_configured": true, 00:16:40.662 "data_offset": 2048, 00:16:40.662 "data_size": 63488 00:16:40.662 }, 00:16:40.662 { 00:16:40.662 "name": "BaseBdev2", 00:16:40.662 "uuid": "642ecf1c-6edf-5af9-8033-d2d7b3a49789", 00:16:40.662 "is_configured": true, 00:16:40.662 "data_offset": 2048, 00:16:40.662 "data_size": 63488 00:16:40.662 }, 00:16:40.662 { 00:16:40.662 "name": "BaseBdev3", 00:16:40.662 "uuid": "400a2674-510b-5ebe-a337-4bd4b85ed820", 00:16:40.662 "is_configured": true, 00:16:40.662 "data_offset": 2048, 00:16:40.662 "data_size": 63488 00:16:40.662 } 00:16:40.662 ] 00:16:40.662 }' 00:16:40.662 10:24:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.662 10:24:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.228 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:41.487 [2024-07-15 
10:24:18.611324] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:41.487 [2024-07-15 10:24:18.611356] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:41.487 [2024-07-15 10:24:18.614533] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:41.487 [2024-07-15 10:24:18.614572] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:41.487 [2024-07-15 10:24:18.614608] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:41.487 [2024-07-15 10:24:18.614619] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1457280 name raid_bdev1, state offline 00:16:41.487 0 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 519213 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 519213 ']' 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 519213 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 519213 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 519213' 00:16:41.487 killing process with pid 519213 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 519213 00:16:41.487 [2024-07-15 10:24:18.680594] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:41.487 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 519213 00:16:41.745 [2024-07-15 10:24:18.701457] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.hvoX1DQLnc 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:16:41.745 00:16:41.745 real 0m6.948s 00:16:41.745 user 0m10.950s 00:16:41.745 sys 0m1.256s 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:41.745 10:24:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.745 ************************************ 00:16:41.745 END TEST raid_read_error_test 00:16:41.745 ************************************ 00:16:42.002 10:24:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:42.002 10:24:18 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:16:42.002 10:24:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:42.002 10:24:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:42.002 10:24:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:42.002 
************************************ 00:16:42.002 START TEST raid_write_error_test 00:16:42.002 ************************************ 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RuA8aAL7tb 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=520196 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 520196 /var/tmp/spdk-raid.sock 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 520196 ']' 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:16:42.002 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:42.002 10:24:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.002 [2024-07-15 10:24:19.075386] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:16:42.002 [2024-07-15 10:24:19.075451] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid520196 ] 00:16:42.260 [2024-07-15 10:24:19.201331] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:42.260 [2024-07-15 10:24:19.302934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:42.260 [2024-07-15 10:24:19.363785] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:42.260 [2024-07-15 10:24:19.363833] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:42.825 10:24:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:42.825 10:24:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:42.825 10:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:42.825 10:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:43.082 BaseBdev1_malloc 00:16:43.082 10:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:43.339 true 
00:16:43.339 10:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:43.597 [2024-07-15 10:24:20.722260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:43.597 [2024-07-15 10:24:20.722307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:43.597 [2024-07-15 10:24:20.722330] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23040d0 00:16:43.597 [2024-07-15 10:24:20.722343] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:43.597 [2024-07-15 10:24:20.724241] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:43.597 [2024-07-15 10:24:20.724276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:43.597 BaseBdev1 00:16:43.597 10:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:43.598 10:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:43.856 BaseBdev2_malloc 00:16:43.856 10:24:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:44.114 true 00:16:44.114 10:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:44.371 [2024-07-15 10:24:21.448806] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:44.371 [2024-07-15 10:24:21.448855] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:44.371 [2024-07-15 10:24:21.448877] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2308910 00:16:44.371 [2024-07-15 10:24:21.448890] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:44.371 [2024-07-15 10:24:21.450541] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:44.371 [2024-07-15 10:24:21.450570] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:44.371 BaseBdev2 00:16:44.371 10:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:44.371 10:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:44.628 BaseBdev3_malloc 00:16:44.628 10:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:44.886 true 00:16:44.886 10:24:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:45.144 [2024-07-15 10:24:22.172522] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:45.145 [2024-07-15 10:24:22.172567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:45.145 [2024-07-15 10:24:22.172589] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x230abd0 00:16:45.145 [2024-07-15 10:24:22.172602] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:45.145 [2024-07-15 10:24:22.174185] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:16:45.145 [2024-07-15 10:24:22.174212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:45.145 BaseBdev3 00:16:45.145 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:45.402 [2024-07-15 10:24:22.413192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:45.402 [2024-07-15 10:24:22.414539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:45.402 [2024-07-15 10:24:22.414609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:45.402 [2024-07-15 10:24:22.414816] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x230c280 00:16:45.402 [2024-07-15 10:24:22.414828] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:45.402 [2024-07-15 10:24:22.415035] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x230be20 00:16:45.402 [2024-07-15 10:24:22.415185] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x230c280 00:16:45.402 [2024-07-15 10:24:22.415196] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x230c280 00:16:45.402 [2024-07-15 10:24:22.415309] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.402 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.403 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.403 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.403 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.403 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:45.661 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.661 "name": "raid_bdev1", 00:16:45.661 "uuid": "a6e38a78-88c1-4a37-aa53-c575d5f8678e", 00:16:45.661 "strip_size_kb": 64, 00:16:45.661 "state": "online", 00:16:45.661 "raid_level": "concat", 00:16:45.661 "superblock": true, 00:16:45.661 "num_base_bdevs": 3, 00:16:45.661 "num_base_bdevs_discovered": 3, 00:16:45.661 "num_base_bdevs_operational": 3, 00:16:45.661 "base_bdevs_list": [ 00:16:45.661 { 00:16:45.661 "name": "BaseBdev1", 00:16:45.661 "uuid": "d296b21f-d34b-5d86-a311-ab34a007e619", 00:16:45.661 "is_configured": true, 00:16:45.661 "data_offset": 2048, 00:16:45.661 "data_size": 63488 00:16:45.661 }, 00:16:45.661 { 00:16:45.661 "name": "BaseBdev2", 00:16:45.661 "uuid": "b2d20ede-0ccd-5f61-9fe3-d8335b97936f", 00:16:45.661 "is_configured": true, 00:16:45.661 "data_offset": 2048, 00:16:45.661 "data_size": 63488 00:16:45.661 }, 00:16:45.661 { 00:16:45.661 
"name": "BaseBdev3", 00:16:45.661 "uuid": "bb7acf04-c57a-586d-b56e-5f2e4f9125c0", 00:16:45.661 "is_configured": true, 00:16:45.661 "data_offset": 2048, 00:16:45.661 "data_size": 63488 00:16:45.661 } 00:16:45.661 ] 00:16:45.661 }' 00:16:45.661 10:24:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.661 10:24:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.239 10:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:46.239 10:24:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:46.239 [2024-07-15 10:24:23.363993] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x215a4d0 00:16:47.173 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.431 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:47.690 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.690 "name": "raid_bdev1", 00:16:47.690 "uuid": "a6e38a78-88c1-4a37-aa53-c575d5f8678e", 00:16:47.690 "strip_size_kb": 64, 00:16:47.690 "state": "online", 00:16:47.690 "raid_level": "concat", 00:16:47.690 "superblock": true, 00:16:47.690 "num_base_bdevs": 3, 00:16:47.690 "num_base_bdevs_discovered": 3, 00:16:47.690 "num_base_bdevs_operational": 3, 00:16:47.690 "base_bdevs_list": [ 00:16:47.690 { 00:16:47.690 "name": "BaseBdev1", 00:16:47.690 "uuid": "d296b21f-d34b-5d86-a311-ab34a007e619", 00:16:47.690 "is_configured": true, 00:16:47.690 "data_offset": 2048, 00:16:47.690 "data_size": 63488 00:16:47.690 }, 00:16:47.690 { 00:16:47.690 "name": "BaseBdev2", 00:16:47.690 "uuid": "b2d20ede-0ccd-5f61-9fe3-d8335b97936f", 00:16:47.690 "is_configured": true, 00:16:47.690 "data_offset": 2048, 00:16:47.690 "data_size": 63488 00:16:47.690 }, 00:16:47.690 { 00:16:47.690 "name": "BaseBdev3", 00:16:47.690 "uuid": "bb7acf04-c57a-586d-b56e-5f2e4f9125c0", 00:16:47.690 "is_configured": true, 00:16:47.690 "data_offset": 2048, 
00:16:47.690 "data_size": 63488 00:16:47.690 } 00:16:47.690 ] 00:16:47.690 }' 00:16:47.690 10:24:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.690 10:24:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.255 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:48.512 [2024-07-15 10:24:25.585218] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:48.512 [2024-07-15 10:24:25.585255] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:48.512 [2024-07-15 10:24:25.588431] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:48.512 [2024-07-15 10:24:25.588469] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:48.512 [2024-07-15 10:24:25.588504] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:48.512 [2024-07-15 10:24:25.588517] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x230c280 name raid_bdev1, state offline 00:16:48.512 0 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 520196 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 520196 ']' 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 520196 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 520196 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 520196' 00:16:48.512 killing process with pid 520196 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 520196 00:16:48.512 [2024-07-15 10:24:25.652179] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:48.512 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 520196 00:16:48.512 [2024-07-15 10:24:25.673107] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RuA8aAL7tb 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:48.769 10:24:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:16:48.769 00:16:48.769 real 0m6.910s 00:16:48.770 user 0m10.946s 00:16:48.770 sys 0m1.216s 00:16:48.770 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:48.770 10:24:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.770 ************************************ 00:16:48.770 END TEST raid_write_error_test 00:16:48.770 
************************************ 00:16:48.770 10:24:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:48.770 10:24:25 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:48.770 10:24:25 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:48.770 10:24:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:48.770 10:24:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:48.770 10:24:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:49.028 ************************************ 00:16:49.028 START TEST raid_state_function_test 00:16:49.028 ************************************ 00:16:49.028 10:24:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:49.028 10:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:49.028 10:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:49.028 10:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:49.028 10:24:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i++ ))
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']'
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=521172
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 521172'
00:16:49.028 Process raid pid: 521172
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 521172 /var/tmp/spdk-raid.sock
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 521172 ']'
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:16:49.028 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:16:49.028 10:24:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:49.028 [2024-07-15 10:24:26.068422] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:16:49.028 [2024-07-15 10:24:26.068492] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:16:49.028 [2024-07-15 10:24:26.200634] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:16:49.286 [2024-07-15 10:24:26.299028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:16:49.286 [2024-07-15 10:24:26.362687] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:49.286 [2024-07-15 10:24:26.362726] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:16:49.850 10:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:16:49.850 10:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:16:49.850 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:16:50.108 [2024-07-15 10:24:27.157335] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:16:50.108 [2024-07-15 10:24:27.157383] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:16:50.108 [2024-07-15 10:24:27.157394] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:16:50.108 [2024-07-15 10:24:27.157406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:16:50.108 [2024-07-15 10:24:27.157415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:16:50.108 [2024-07-15 10:24:27.157426] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:50.108 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:50.365 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:50.365 "name": "Existed_Raid",
00:16:50.366 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:50.366 "strip_size_kb": 0,
00:16:50.366 "state": "configuring",
00:16:50.366 "raid_level": "raid1",
00:16:50.366 "superblock": false,
00:16:50.366 "num_base_bdevs": 3,
00:16:50.366 "num_base_bdevs_discovered": 0,
00:16:50.366 "num_base_bdevs_operational": 3,
00:16:50.366 "base_bdevs_list": [
00:16:50.366 {
00:16:50.366 "name": "BaseBdev1",
00:16:50.366 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:50.366 "is_configured": false,
00:16:50.366 "data_offset": 0,
00:16:50.366 "data_size": 0
00:16:50.366 },
00:16:50.366 {
00:16:50.366 "name": "BaseBdev2",
00:16:50.366 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:50.366 "is_configured": false,
00:16:50.366 "data_offset": 0,
00:16:50.366 "data_size": 0
00:16:50.366 },
00:16:50.366 {
00:16:50.366 "name": "BaseBdev3",
00:16:50.366 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:50.366 "is_configured": false,
00:16:50.366 "data_offset": 0,
00:16:50.366 "data_size": 0
00:16:50.366 }
00:16:50.366 ]
00:16:50.366 }'
00:16:50.366 10:24:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:50.366 10:24:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:50.929 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:16:51.186 [2024-07-15 10:24:28.260124] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:16:51.186 [2024-07-15 10:24:28.260160] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1795a80 name Existed_Raid, state configuring
00:16:51.186 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:16:51.443 [2024-07-15 10:24:28.432590] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:16:51.443 [2024-07-15 10:24:28.432624] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:16:51.443 [2024-07-15 10:24:28.432635] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:16:51.443 [2024-07-15 10:24:28.432646] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:16:51.443 [2024-07-15 10:24:28.432655] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:16:51.443 [2024-07-15 10:24:28.432666] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:16:51.443 [2024-07-15 10:24:28.618976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:51.443 BaseBdev1
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:51.443 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:51.701 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:16:51.957 [
00:16:51.957 {
00:16:51.957 "name": "BaseBdev1",
00:16:51.957 "aliases": [
00:16:51.957 "5335f390-0515-492b-9b76-d12f59657b4e"
00:16:51.957 ],
00:16:51.957 "product_name": "Malloc disk",
00:16:51.957 "block_size": 512,
00:16:51.958 "num_blocks": 65536,
00:16:51.958 "uuid": "5335f390-0515-492b-9b76-d12f59657b4e",
00:16:51.958 "assigned_rate_limits": {
00:16:51.958 "rw_ios_per_sec": 0,
00:16:51.958 "rw_mbytes_per_sec": 0,
00:16:51.958 "r_mbytes_per_sec": 0,
00:16:51.958 "w_mbytes_per_sec": 0
00:16:51.958 },
00:16:51.958 "claimed": true,
00:16:51.958 "claim_type": "exclusive_write",
00:16:51.958 "zoned": false,
00:16:51.958 "supported_io_types": {
00:16:51.958 "read": true,
00:16:51.958 "write": true,
00:16:51.958 "unmap": true,
00:16:51.958 "flush": true,
00:16:51.958 "reset": true,
00:16:51.958 "nvme_admin": false,
00:16:51.958 "nvme_io": false,
00:16:51.958 "nvme_io_md": false,
00:16:51.958 "write_zeroes": true,
00:16:51.958 "zcopy": true,
00:16:51.958 "get_zone_info": false,
00:16:51.958 "zone_management": false,
00:16:51.958 "zone_append": false,
00:16:51.958 "compare": false,
00:16:51.958 "compare_and_write": false,
00:16:51.958 "abort": true,
00:16:51.958 "seek_hole": false,
00:16:51.958 "seek_data": false,
00:16:51.958 "copy": true,
00:16:51.958 "nvme_iov_md": false
00:16:51.958 },
00:16:51.958 "memory_domains": [
00:16:51.958 {
00:16:51.958 "dma_device_id": "system",
00:16:51.958 "dma_device_type": 1
00:16:51.958 },
00:16:51.958 {
00:16:51.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:51.958 "dma_device_type": 2
00:16:51.958 }
00:16:51.958 ],
00:16:51.958 "driver_specific": {}
00:16:51.958 }
00:16:51.958 ]
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:51.958 10:24:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:52.231 10:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:52.231 "name": "Existed_Raid",
00:16:52.231 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:52.231 "strip_size_kb": 0,
00:16:52.231 "state": "configuring",
00:16:52.231 "raid_level": "raid1",
00:16:52.231 "superblock": false,
00:16:52.231 "num_base_bdevs": 3,
00:16:52.231 "num_base_bdevs_discovered": 1,
00:16:52.231 "num_base_bdevs_operational": 3,
00:16:52.231 "base_bdevs_list": [
00:16:52.231 {
00:16:52.231 "name": "BaseBdev1",
00:16:52.231 "uuid": "5335f390-0515-492b-9b76-d12f59657b4e",
00:16:52.231 "is_configured": true,
00:16:52.231 "data_offset": 0,
00:16:52.231 "data_size": 65536
00:16:52.231 },
00:16:52.231 {
00:16:52.231 "name": "BaseBdev2",
00:16:52.231 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:52.231 "is_configured": false,
00:16:52.231 "data_offset": 0,
00:16:52.231 "data_size": 0
00:16:52.231 },
00:16:52.231 {
00:16:52.231 "name": "BaseBdev3",
00:16:52.231 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:52.231 "is_configured": false,
00:16:52.231 "data_offset": 0,
00:16:52.231 "data_size": 0
00:16:52.231 }
00:16:52.231 ]
00:16:52.231 }'
00:16:52.231 10:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:52.231 10:24:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:52.796 10:24:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:16:53.053 [2024-07-15 10:24:30.014660] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:16:53.053 [2024-07-15 10:24:30.014715] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1795310 name Existed_Raid, state configuring
00:16:53.053 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:16:53.310 [2024-07-15 10:24:30.259351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:16:53.310 [2024-07-15 10:24:30.260818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:16:53.310 [2024-07-15 10:24:30.260852] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:16:53.310 [2024-07-15 10:24:30.260863] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:16:53.310 [2024-07-15 10:24:30.260874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:53.310 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:53.568 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:53.568 "name": "Existed_Raid",
00:16:53.568 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:53.568 "strip_size_kb": 0,
00:16:53.568 "state": "configuring",
00:16:53.568 "raid_level": "raid1",
00:16:53.568 "superblock": false,
00:16:53.568 "num_base_bdevs": 3,
00:16:53.568 "num_base_bdevs_discovered": 1,
00:16:53.568 "num_base_bdevs_operational": 3,
00:16:53.568 "base_bdevs_list": [
00:16:53.568 {
00:16:53.568 "name": "BaseBdev1",
00:16:53.568 "uuid": "5335f390-0515-492b-9b76-d12f59657b4e",
00:16:53.568 "is_configured": true,
00:16:53.568 "data_offset": 0,
00:16:53.568 "data_size": 65536
00:16:53.568 },
00:16:53.568 {
00:16:53.568 "name": "BaseBdev2",
00:16:53.568 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:53.568 "is_configured": false,
00:16:53.568 "data_offset": 0,
00:16:53.568 "data_size": 0
00:16:53.568 },
00:16:53.568 {
00:16:53.568 "name": "BaseBdev3",
00:16:53.568 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:53.568 "is_configured": false,
00:16:53.568 "data_offset": 0,
00:16:53.568 "data_size": 0
00:16:53.568 }
00:16:53.568 ]
00:16:53.568 }'
00:16:53.568 10:24:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:53.568 10:24:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:16:54.134 [2024-07-15 10:24:31.297507] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:16:54.134 BaseBdev2
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:54.134 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:54.392 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:16:54.649 [
00:16:54.649 {
00:16:54.649 "name": "BaseBdev2",
00:16:54.649 "aliases": [
00:16:54.649 "89b6aa2d-b333-46e8-8179-29193c154b2e"
00:16:54.649 ],
00:16:54.649 "product_name": "Malloc disk",
00:16:54.649 "block_size": 512,
00:16:54.649 "num_blocks": 65536,
00:16:54.649 "uuid": "89b6aa2d-b333-46e8-8179-29193c154b2e",
00:16:54.649 "assigned_rate_limits": {
00:16:54.649 "rw_ios_per_sec": 0,
00:16:54.649 "rw_mbytes_per_sec": 0,
00:16:54.649 "r_mbytes_per_sec": 0,
00:16:54.649 "w_mbytes_per_sec": 0
00:16:54.649 },
00:16:54.649 "claimed": true,
00:16:54.649 "claim_type": "exclusive_write",
00:16:54.649 "zoned": false,
00:16:54.649 "supported_io_types": {
00:16:54.649 "read": true,
00:16:54.649 "write": true,
00:16:54.649 "unmap": true,
00:16:54.649 "flush": true,
00:16:54.649 "reset": true,
00:16:54.649 "nvme_admin": false,
00:16:54.649 "nvme_io": false,
00:16:54.649 "nvme_io_md": false,
00:16:54.649 "write_zeroes": true,
00:16:54.649 "zcopy": true,
00:16:54.649 "get_zone_info": false,
00:16:54.649 "zone_management": false,
00:16:54.649 "zone_append": false,
00:16:54.649 "compare": false,
00:16:54.649 "compare_and_write": false,
00:16:54.649 "abort": true,
00:16:54.649 "seek_hole": false,
00:16:54.649 "seek_data": false,
00:16:54.649 "copy": true,
00:16:54.649 "nvme_iov_md": false
00:16:54.649 },
00:16:54.649 "memory_domains": [
00:16:54.649 {
00:16:54.649 "dma_device_id": "system",
00:16:54.649 "dma_device_type": 1
00:16:54.649 },
00:16:54.649 {
00:16:54.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:54.649 "dma_device_type": 2
00:16:54.649 }
00:16:54.649 ],
00:16:54.649 "driver_specific": {}
00:16:54.649 }
00:16:54.649 ]
00:16:54.649 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:54.650 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:54.907 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:54.907 "name": "Existed_Raid",
00:16:54.907 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:54.907 "strip_size_kb": 0,
00:16:54.907 "state": "configuring",
00:16:54.907 "raid_level": "raid1",
00:16:54.907 "superblock": false,
00:16:54.907 "num_base_bdevs": 3,
00:16:54.907 "num_base_bdevs_discovered": 2,
00:16:54.907 "num_base_bdevs_operational": 3,
00:16:54.907 "base_bdevs_list": [
00:16:54.907 {
00:16:54.907 "name": "BaseBdev1",
00:16:54.907 "uuid": "5335f390-0515-492b-9b76-d12f59657b4e",
00:16:54.907 "is_configured": true,
00:16:54.907 "data_offset": 0,
00:16:54.907 "data_size": 65536
00:16:54.907 },
00:16:54.907 {
00:16:54.907 "name": "BaseBdev2",
00:16:54.907 "uuid": "89b6aa2d-b333-46e8-8179-29193c154b2e",
00:16:54.907 "is_configured": true,
00:16:54.907 "data_offset": 0,
00:16:54.907 "data_size": 65536
00:16:54.907 },
00:16:54.907 {
00:16:54.907 "name": "BaseBdev3",
00:16:54.907 "uuid": "00000000-0000-0000-0000-000000000000",
00:16:54.907 "is_configured": false,
00:16:54.907 "data_offset": 0,
00:16:54.907 "data_size": 0
00:16:54.907 }
00:16:54.907 ]
00:16:54.907 }'
00:16:54.907 10:24:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:54.907 10:24:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:55.474 10:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:16:55.731 [2024-07-15 10:24:32.724744] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:16:55.731 [2024-07-15 10:24:32.724798] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1796400
00:16:55.731 [2024-07-15 10:24:32.724808] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512
00:16:55.731 [2024-07-15 10:24:32.725059] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1795ef0
00:16:55.731 [2024-07-15 10:24:32.725187] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1796400
00:16:55.731 [2024-07-15 10:24:32.725197] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1796400
00:16:55.731 [2024-07-15 10:24:32.725362] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:16:55.731 BaseBdev3
00:16:55.731 10:24:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3
00:16:55.731 10:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3
00:16:55.731 10:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:16:55.731 10:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:16:55.731 10:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:16:55.731 10:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:16:55.731 10:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:16:55.988 10:24:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:16:56.245 [
00:16:56.245 {
00:16:56.245 "name": "BaseBdev3",
00:16:56.245 "aliases": [
00:16:56.245 "22e5632e-b4b5-4c93-93c2-c194235d8281"
00:16:56.245 ],
00:16:56.245 "product_name": "Malloc disk",
00:16:56.245 "block_size": 512,
00:16:56.245 "num_blocks": 65536,
00:16:56.245 "uuid": "22e5632e-b4b5-4c93-93c2-c194235d8281",
00:16:56.245 "assigned_rate_limits": {
00:16:56.245 "rw_ios_per_sec": 0,
00:16:56.245 "rw_mbytes_per_sec": 0,
00:16:56.245 "r_mbytes_per_sec": 0,
00:16:56.245 "w_mbytes_per_sec": 0
00:16:56.245 },
00:16:56.245 "claimed": true,
00:16:56.245 "claim_type": "exclusive_write",
00:16:56.245 "zoned": false,
00:16:56.245 "supported_io_types": {
00:16:56.245 "read": true,
00:16:56.245 "write": true,
00:16:56.245 "unmap": true,
00:16:56.245 "flush": true,
00:16:56.245 "reset": true,
00:16:56.245 "nvme_admin": false,
00:16:56.245 "nvme_io": false,
00:16:56.245 "nvme_io_md": false,
00:16:56.245 "write_zeroes": true,
00:16:56.245 "zcopy": true,
00:16:56.245 "get_zone_info": false,
00:16:56.245 "zone_management": false,
00:16:56.245 "zone_append": false,
00:16:56.245 "compare": false,
00:16:56.245 "compare_and_write": false,
00:16:56.245 "abort": true,
00:16:56.245 "seek_hole": false,
00:16:56.245 "seek_data": false,
00:16:56.245 "copy": true,
00:16:56.245 "nvme_iov_md": false
00:16:56.245 },
00:16:56.245 "memory_domains": [
00:16:56.245 {
00:16:56.245 "dma_device_id": "system",
00:16:56.245 "dma_device_type": 1
00:16:56.245 },
00:16:56.245 {
00:16:56.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:56.245 "dma_device_type": 2
00:16:56.245 }
00:16:56.245 ],
00:16:56.245 "driver_specific": {}
00:16:56.245 }
00:16:56.245 ]
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:16:56.245 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:16:56.503 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:16:56.503 "name": "Existed_Raid",
00:16:56.503 "uuid": "6d9a6a69-cc86-4479-a050-dcba6b6df603",
00:16:56.503 "strip_size_kb": 0,
00:16:56.503 "state": "online",
00:16:56.503 "raid_level": "raid1",
00:16:56.503 "superblock": false,
00:16:56.503 "num_base_bdevs": 3,
00:16:56.503 "num_base_bdevs_discovered": 3,
00:16:56.503 "num_base_bdevs_operational": 3,
00:16:56.503 "base_bdevs_list": [
00:16:56.503 {
00:16:56.503 "name": "BaseBdev1",
00:16:56.503 "uuid": "5335f390-0515-492b-9b76-d12f59657b4e",
00:16:56.503 "is_configured": true,
00:16:56.503 "data_offset": 0,
00:16:56.503 "data_size": 65536
00:16:56.503 },
00:16:56.503 {
00:16:56.503 "name": "BaseBdev2",
00:16:56.503 "uuid": "89b6aa2d-b333-46e8-8179-29193c154b2e",
00:16:56.503 "is_configured": true,
00:16:56.503 "data_offset": 0,
00:16:56.503 "data_size": 65536
00:16:56.503 },
00:16:56.503 {
00:16:56.503 "name": "BaseBdev3",
00:16:56.503 "uuid": "22e5632e-b4b5-4c93-93c2-c194235d8281",
00:16:56.503 "is_configured": true,
00:16:56.503 "data_offset": 0,
00:16:56.503 "data_size": 65536
00:16:56.503 }
00:16:56.503 ]
00:16:56.503 }'
00:16:56.503 10:24:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:16:56.503 10:24:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:16:57.069 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:16:57.328 [2024-07-15 10:24:34.313293] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:16:57.328 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:16:57.328 "name": "Existed_Raid",
00:16:57.328 "aliases": [
00:16:57.328 "6d9a6a69-cc86-4479-a050-dcba6b6df603"
00:16:57.328 ],
00:16:57.328 "product_name": "Raid Volume",
00:16:57.328 "block_size": 512,
00:16:57.328 "num_blocks": 65536,
00:16:57.328 "uuid": "6d9a6a69-cc86-4479-a050-dcba6b6df603",
00:16:57.328 "assigned_rate_limits": {
00:16:57.328 "rw_ios_per_sec": 0,
00:16:57.328 "rw_mbytes_per_sec": 0,
00:16:57.328 "r_mbytes_per_sec": 0,
00:16:57.328 "w_mbytes_per_sec": 0
00:16:57.328 },
00:16:57.328 "claimed": false,
00:16:57.328 "zoned": false,
00:16:57.328 "supported_io_types": {
00:16:57.328 "read": true,
00:16:57.328 "write": true,
00:16:57.328 "unmap": false,
00:16:57.328 "flush": false,
00:16:57.328 "reset": true,
00:16:57.328 "nvme_admin": false,
00:16:57.328 "nvme_io": false,
00:16:57.328 "nvme_io_md": false,
00:16:57.328 "write_zeroes": true,
00:16:57.328 "zcopy": false,
00:16:57.328 "get_zone_info": false,
00:16:57.328 "zone_management": false,
00:16:57.328 "zone_append": false,
00:16:57.328 "compare": false,
00:16:57.328 "compare_and_write": false,
00:16:57.328 "abort": false,
00:16:57.328 "seek_hole": false,
00:16:57.328 "seek_data": false,
00:16:57.328 "copy": false,
00:16:57.328 "nvme_iov_md": false
00:16:57.328 },
00:16:57.328 "memory_domains": [
00:16:57.328 {
00:16:57.328 "dma_device_id": "system",
00:16:57.328 "dma_device_type": 1
00:16:57.328 },
00:16:57.328 {
00:16:57.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:57.328 "dma_device_type": 2
00:16:57.328 },
00:16:57.328 {
00:16:57.328 "dma_device_id": "system",
00:16:57.328 "dma_device_type": 1
00:16:57.328 },
00:16:57.328 {
00:16:57.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:57.328 "dma_device_type": 2
00:16:57.328 },
00:16:57.328 {
00:16:57.328 "dma_device_id": "system",
00:16:57.328 "dma_device_type": 1
00:16:57.328 },
00:16:57.328 {
00:16:57.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:57.328 "dma_device_type": 2
00:16:57.328 }
00:16:57.328 ],
00:16:57.328 "driver_specific": {
00:16:57.328 "raid": {
00:16:57.328 "uuid": "6d9a6a69-cc86-4479-a050-dcba6b6df603",
00:16:57.328 "strip_size_kb": 0,
00:16:57.328 "state": "online",
00:16:57.328 "raid_level": "raid1",
00:16:57.328 "superblock": false,
00:16:57.328 "num_base_bdevs": 3,
00:16:57.328 "num_base_bdevs_discovered": 3,
00:16:57.328 "num_base_bdevs_operational": 3,
00:16:57.328 "base_bdevs_list": [
00:16:57.328 {
00:16:57.328 "name": "BaseBdev1",
00:16:57.328 "uuid": "5335f390-0515-492b-9b76-d12f59657b4e",
00:16:57.328 "is_configured": true,
00:16:57.328 "data_offset": 0,
00:16:57.328 "data_size": 65536
00:16:57.328 },
00:16:57.328 {
00:16:57.328 "name": "BaseBdev2",
00:16:57.328 "uuid": "89b6aa2d-b333-46e8-8179-29193c154b2e",
00:16:57.328 "is_configured": true,
00:16:57.328 "data_offset": 0,
00:16:57.328 "data_size": 65536
00:16:57.328 },
00:16:57.328 {
00:16:57.328 "name": "BaseBdev3",
00:16:57.328 "uuid": "22e5632e-b4b5-4c93-93c2-c194235d8281",
00:16:57.328 "is_configured": true,
00:16:57.328 "data_offset": 0,
00:16:57.328 "data_size": 65536
00:16:57.328 }
00:16:57.328 ]
00:16:57.328 }
00:16:57.328 }
00:16:57.328 }'
00:16:57.328 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:16:57.328 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1
00:16:57.328 BaseBdev2
00:16:57.328 BaseBdev3'
00:16:57.328 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:16:57.328 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1
00:16:57.328 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:16:57.586 "name": "BaseBdev1",
00:16:57.586 "aliases": [
00:16:57.586 "5335f390-0515-492b-9b76-d12f59657b4e"
00:16:57.586 ],
00:16:57.586 "product_name": "Malloc disk",
00:16:57.586 "block_size": 512,
00:16:57.586 "num_blocks": 65536,
00:16:57.586 "uuid": "5335f390-0515-492b-9b76-d12f59657b4e",
00:16:57.586 "assigned_rate_limits": {
00:16:57.586 "rw_ios_per_sec": 0,
00:16:57.586 "rw_mbytes_per_sec": 0,
00:16:57.586 "r_mbytes_per_sec": 0,
00:16:57.586 "w_mbytes_per_sec": 0
00:16:57.586 },
00:16:57.586 "claimed": true,
00:16:57.586 "claim_type": "exclusive_write",
00:16:57.586 "zoned": false,
00:16:57.586 "supported_io_types": {
00:16:57.586 "read": true,
00:16:57.586 "write": true,
00:16:57.586 "unmap": true,
00:16:57.586 "flush": true,
00:16:57.586 "reset": true,
00:16:57.586 "nvme_admin": false,
00:16:57.586 "nvme_io": false,
00:16:57.586 "nvme_io_md": false,
00:16:57.586 "write_zeroes": true,
00:16:57.586 "zcopy": true,
00:16:57.586 "get_zone_info": false,
00:16:57.586 "zone_management": false,
00:16:57.586 "zone_append": false,
00:16:57.586 "compare": false,
00:16:57.586 "compare_and_write": false,
00:16:57.586 "abort": true,
00:16:57.586 "seek_hole": false,
00:16:57.586 "seek_data": false,
00:16:57.586 "copy": true,
00:16:57.586 "nvme_iov_md": false
00:16:57.586 },
00:16:57.586 "memory_domains": [
00:16:57.586 {
00:16:57.586 "dma_device_id": "system",
00:16:57.586 "dma_device_type": 1
00:16:57.586 },
00:16:57.586 {
00:16:57.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:16:57.586 "dma_device_type": 2
00:16:57.586 }
00:16:57.586 ],
00:16:57.586 "driver_specific": {}
00:16:57.586 }'
00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:16:57.586 10:24:34
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:57.586 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:57.844 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:57.844 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:57.844 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:57.844 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:57.844 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:57.844 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:57.844 10:24:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:58.412 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:58.412 "name": "BaseBdev2", 00:16:58.412 "aliases": [ 00:16:58.412 "89b6aa2d-b333-46e8-8179-29193c154b2e" 00:16:58.412 ], 00:16:58.412 "product_name": "Malloc disk", 00:16:58.412 "block_size": 512, 00:16:58.412 "num_blocks": 65536, 00:16:58.412 "uuid": "89b6aa2d-b333-46e8-8179-29193c154b2e", 00:16:58.412 "assigned_rate_limits": { 00:16:58.412 "rw_ios_per_sec": 0, 00:16:58.412 "rw_mbytes_per_sec": 0, 00:16:58.412 "r_mbytes_per_sec": 0, 00:16:58.412 "w_mbytes_per_sec": 0 00:16:58.412 }, 00:16:58.412 "claimed": true, 00:16:58.412 "claim_type": "exclusive_write", 00:16:58.412 "zoned": false, 00:16:58.412 "supported_io_types": { 00:16:58.412 "read": true, 00:16:58.412 "write": true, 00:16:58.412 "unmap": true, 00:16:58.412 "flush": true, 00:16:58.412 "reset": true, 00:16:58.412 "nvme_admin": false, 00:16:58.412 "nvme_io": false, 00:16:58.412 "nvme_io_md": false, 
00:16:58.412 "write_zeroes": true, 00:16:58.412 "zcopy": true, 00:16:58.412 "get_zone_info": false, 00:16:58.412 "zone_management": false, 00:16:58.412 "zone_append": false, 00:16:58.412 "compare": false, 00:16:58.412 "compare_and_write": false, 00:16:58.412 "abort": true, 00:16:58.412 "seek_hole": false, 00:16:58.412 "seek_data": false, 00:16:58.412 "copy": true, 00:16:58.412 "nvme_iov_md": false 00:16:58.412 }, 00:16:58.412 "memory_domains": [ 00:16:58.412 { 00:16:58.412 "dma_device_id": "system", 00:16:58.412 "dma_device_type": 1 00:16:58.412 }, 00:16:58.412 { 00:16:58.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.412 "dma_device_type": 2 00:16:58.412 } 00:16:58.412 ], 00:16:58.412 "driver_specific": {} 00:16:58.412 }' 00:16:58.412 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.412 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.412 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:58.412 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.412 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:58.670 10:24:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:58.670 10:24:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:58.927 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:58.927 "name": "BaseBdev3", 00:16:58.927 "aliases": [ 00:16:58.927 "22e5632e-b4b5-4c93-93c2-c194235d8281" 00:16:58.927 ], 00:16:58.927 "product_name": "Malloc disk", 00:16:58.927 "block_size": 512, 00:16:58.927 "num_blocks": 65536, 00:16:58.927 "uuid": "22e5632e-b4b5-4c93-93c2-c194235d8281", 00:16:58.927 "assigned_rate_limits": { 00:16:58.927 "rw_ios_per_sec": 0, 00:16:58.927 "rw_mbytes_per_sec": 0, 00:16:58.927 "r_mbytes_per_sec": 0, 00:16:58.927 "w_mbytes_per_sec": 0 00:16:58.927 }, 00:16:58.927 "claimed": true, 00:16:58.927 "claim_type": "exclusive_write", 00:16:58.927 "zoned": false, 00:16:58.927 "supported_io_types": { 00:16:58.927 "read": true, 00:16:58.927 "write": true, 00:16:58.927 "unmap": true, 00:16:58.927 "flush": true, 00:16:58.927 "reset": true, 00:16:58.927 "nvme_admin": false, 00:16:58.927 "nvme_io": false, 00:16:58.927 "nvme_io_md": false, 00:16:58.927 "write_zeroes": true, 00:16:58.927 "zcopy": true, 00:16:58.927 "get_zone_info": false, 00:16:58.927 "zone_management": false, 00:16:58.927 "zone_append": false, 00:16:58.927 "compare": false, 00:16:58.927 "compare_and_write": false, 00:16:58.927 "abort": true, 00:16:58.927 "seek_hole": false, 00:16:58.927 "seek_data": false, 00:16:58.927 "copy": true, 00:16:58.927 "nvme_iov_md": false 00:16:58.927 }, 00:16:58.927 "memory_domains": [ 00:16:58.927 { 00:16:58.927 "dma_device_id": "system", 00:16:58.927 "dma_device_type": 1 00:16:58.927 }, 00:16:58.927 { 00:16:58.927 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:58.927 "dma_device_type": 2 00:16:58.927 } 00:16:58.927 ], 00:16:58.927 "driver_specific": {} 00:16:58.927 }' 00:16:58.927 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.927 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.185 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:59.450 [2024-07-15 10:24:36.595130] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.450 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:59.791 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.791 "name": "Existed_Raid", 00:16:59.791 "uuid": "6d9a6a69-cc86-4479-a050-dcba6b6df603", 00:16:59.791 "strip_size_kb": 0, 00:16:59.791 "state": "online", 00:16:59.791 "raid_level": "raid1", 
00:16:59.791 "superblock": false, 00:16:59.791 "num_base_bdevs": 3, 00:16:59.791 "num_base_bdevs_discovered": 2, 00:16:59.791 "num_base_bdevs_operational": 2, 00:16:59.791 "base_bdevs_list": [ 00:16:59.791 { 00:16:59.791 "name": null, 00:16:59.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.791 "is_configured": false, 00:16:59.791 "data_offset": 0, 00:16:59.791 "data_size": 65536 00:16:59.791 }, 00:16:59.791 { 00:16:59.791 "name": "BaseBdev2", 00:16:59.791 "uuid": "89b6aa2d-b333-46e8-8179-29193c154b2e", 00:16:59.791 "is_configured": true, 00:16:59.791 "data_offset": 0, 00:16:59.792 "data_size": 65536 00:16:59.792 }, 00:16:59.792 { 00:16:59.792 "name": "BaseBdev3", 00:16:59.792 "uuid": "22e5632e-b4b5-4c93-93c2-c194235d8281", 00:16:59.792 "is_configured": true, 00:16:59.792 "data_offset": 0, 00:16:59.792 "data_size": 65536 00:16:59.792 } 00:16:59.792 ] 00:16:59.792 }' 00:16:59.792 10:24:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.792 10:24:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.355 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:00.355 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:00.355 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.355 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:00.612 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:00.612 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:00.612 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:00.870 [2024-07-15 10:24:37.844391] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:00.870 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:00.870 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:00.870 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.870 10:24:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:01.129 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:01.129 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:01.129 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:01.388 [2024-07-15 10:24:38.341647] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:01.388 [2024-07-15 10:24:38.341746] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:01.388 [2024-07-15 10:24:38.354363] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:01.388 [2024-07-15 10:24:38.354404] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:01.388 [2024-07-15 10:24:38.354416] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1796400 name Existed_Raid, state offline 00:17:01.388 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:01.388 10:24:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:01.388 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.388 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:01.647 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:01.647 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:01.647 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:01.647 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:01.647 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:01.647 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:01.904 BaseBdev2 00:17:01.905 10:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:01.905 10:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:01.905 10:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:01.905 10:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:01.905 10:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:01.905 10:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:01.905 10:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:02.163 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:02.163 [ 00:17:02.163 { 00:17:02.163 "name": "BaseBdev2", 00:17:02.163 "aliases": [ 00:17:02.163 "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6" 00:17:02.163 ], 00:17:02.163 "product_name": "Malloc disk", 00:17:02.163 "block_size": 512, 00:17:02.163 "num_blocks": 65536, 00:17:02.163 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:02.163 "assigned_rate_limits": { 00:17:02.163 "rw_ios_per_sec": 0, 00:17:02.163 "rw_mbytes_per_sec": 0, 00:17:02.163 "r_mbytes_per_sec": 0, 00:17:02.163 "w_mbytes_per_sec": 0 00:17:02.163 }, 00:17:02.163 "claimed": false, 00:17:02.163 "zoned": false, 00:17:02.163 "supported_io_types": { 00:17:02.163 "read": true, 00:17:02.163 "write": true, 00:17:02.163 "unmap": true, 00:17:02.163 "flush": true, 00:17:02.163 "reset": true, 00:17:02.163 "nvme_admin": false, 00:17:02.163 "nvme_io": false, 00:17:02.163 "nvme_io_md": false, 00:17:02.163 "write_zeroes": true, 00:17:02.163 "zcopy": true, 00:17:02.163 "get_zone_info": false, 00:17:02.163 "zone_management": false, 00:17:02.163 "zone_append": false, 00:17:02.163 "compare": false, 00:17:02.163 "compare_and_write": false, 00:17:02.163 "abort": true, 00:17:02.163 "seek_hole": false, 00:17:02.163 "seek_data": false, 00:17:02.163 "copy": true, 00:17:02.163 "nvme_iov_md": false 00:17:02.163 }, 00:17:02.163 "memory_domains": [ 00:17:02.163 { 00:17:02.163 "dma_device_id": "system", 00:17:02.163 "dma_device_type": 1 00:17:02.163 }, 00:17:02.163 { 00:17:02.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.163 "dma_device_type": 2 00:17:02.163 } 00:17:02.163 ], 00:17:02.163 "driver_specific": {} 00:17:02.163 } 00:17:02.163 ] 00:17:02.163 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:02.163 
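The `waitforbdev BaseBdev2` trace above shows `autotest_common.sh` defaulting an empty timeout to 2000 before polling `bdev_get_bdevs -b BaseBdev2 -t 2000`. A minimal sketch of that defaulting step (the helper name below is mine, for illustration; it is not the actual `waitforbdev` implementation):

```shell
# Illustrative sketch of the waitforbdev argument handling seen in the trace:
# an empty timeout argument falls back to 2000 before the RPC poll.
waitforbdev_timeout() {
    local bdev_name=$1
    local bdev_timeout=$2
    [[ -z $bdev_timeout ]] && bdev_timeout=2000  # default, per the trace
    echo "$bdev_name $bdev_timeout"
}

waitforbdev_timeout BaseBdev2 ''  # prints: BaseBdev2 2000
```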
10:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:02.163 10:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:02.163 10:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:02.422 BaseBdev3 00:17:02.422 10:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:02.422 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:02.422 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:02.422 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:02.422 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:02.422 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:02.422 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:02.680 10:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:02.939 [ 00:17:02.939 { 00:17:02.939 "name": "BaseBdev3", 00:17:02.939 "aliases": [ 00:17:02.939 "e3acdc15-356f-41bc-a488-eecc605eee62" 00:17:02.939 ], 00:17:02.939 "product_name": "Malloc disk", 00:17:02.939 "block_size": 512, 00:17:02.939 "num_blocks": 65536, 00:17:02.939 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:02.939 "assigned_rate_limits": { 00:17:02.939 "rw_ios_per_sec": 0, 00:17:02.939 "rw_mbytes_per_sec": 0, 00:17:02.939 
"r_mbytes_per_sec": 0, 00:17:02.939 "w_mbytes_per_sec": 0 00:17:02.939 }, 00:17:02.939 "claimed": false, 00:17:02.939 "zoned": false, 00:17:02.939 "supported_io_types": { 00:17:02.939 "read": true, 00:17:02.939 "write": true, 00:17:02.939 "unmap": true, 00:17:02.939 "flush": true, 00:17:02.939 "reset": true, 00:17:02.939 "nvme_admin": false, 00:17:02.939 "nvme_io": false, 00:17:02.939 "nvme_io_md": false, 00:17:02.939 "write_zeroes": true, 00:17:02.939 "zcopy": true, 00:17:02.939 "get_zone_info": false, 00:17:02.939 "zone_management": false, 00:17:02.939 "zone_append": false, 00:17:02.939 "compare": false, 00:17:02.939 "compare_and_write": false, 00:17:02.939 "abort": true, 00:17:02.939 "seek_hole": false, 00:17:02.939 "seek_data": false, 00:17:02.939 "copy": true, 00:17:02.939 "nvme_iov_md": false 00:17:02.939 }, 00:17:02.939 "memory_domains": [ 00:17:02.939 { 00:17:02.939 "dma_device_id": "system", 00:17:02.939 "dma_device_type": 1 00:17:02.939 }, 00:17:02.939 { 00:17:02.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.939 "dma_device_type": 2 00:17:02.939 } 00:17:02.939 ], 00:17:02.939 "driver_specific": {} 00:17:02.939 } 00:17:02.939 ] 00:17:02.939 10:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:02.939 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:02.939 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:02.939 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:03.198 [2024-07-15 10:24:40.300959] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:03.198 [2024-07-15 10:24:40.301007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:17:03.198 [2024-07-15 10:24:40.301029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:03.198 [2024-07-15 10:24:40.302422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.198 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:03.456 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:03.456 "name": "Existed_Raid", 00:17:03.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.456 "strip_size_kb": 0, 00:17:03.456 "state": 
"configuring", 00:17:03.456 "raid_level": "raid1", 00:17:03.456 "superblock": false, 00:17:03.456 "num_base_bdevs": 3, 00:17:03.456 "num_base_bdevs_discovered": 2, 00:17:03.456 "num_base_bdevs_operational": 3, 00:17:03.456 "base_bdevs_list": [ 00:17:03.456 { 00:17:03.456 "name": "BaseBdev1", 00:17:03.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:03.456 "is_configured": false, 00:17:03.456 "data_offset": 0, 00:17:03.456 "data_size": 0 00:17:03.456 }, 00:17:03.456 { 00:17:03.456 "name": "BaseBdev2", 00:17:03.456 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:03.456 "is_configured": true, 00:17:03.456 "data_offset": 0, 00:17:03.456 "data_size": 65536 00:17:03.456 }, 00:17:03.456 { 00:17:03.456 "name": "BaseBdev3", 00:17:03.456 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:03.456 "is_configured": true, 00:17:03.456 "data_offset": 0, 00:17:03.456 "data_size": 65536 00:17:03.456 } 00:17:03.456 ] 00:17:03.456 }' 00:17:03.456 10:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:03.456 10:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:04.022 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:04.281 [2024-07-15 10:24:41.367767] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.281 10:24:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.281 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.540 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.540 "name": "Existed_Raid", 00:17:04.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.540 "strip_size_kb": 0, 00:17:04.540 "state": "configuring", 00:17:04.540 "raid_level": "raid1", 00:17:04.540 "superblock": false, 00:17:04.540 "num_base_bdevs": 3, 00:17:04.540 "num_base_bdevs_discovered": 1, 00:17:04.540 "num_base_bdevs_operational": 3, 00:17:04.540 "base_bdevs_list": [ 00:17:04.540 { 00:17:04.540 "name": "BaseBdev1", 00:17:04.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.540 "is_configured": false, 00:17:04.540 "data_offset": 0, 00:17:04.540 "data_size": 0 00:17:04.540 }, 00:17:04.540 { 00:17:04.540 "name": null, 00:17:04.540 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:04.540 "is_configured": false, 00:17:04.540 "data_offset": 0, 00:17:04.540 "data_size": 65536 00:17:04.540 }, 00:17:04.540 { 00:17:04.540 "name": "BaseBdev3", 00:17:04.540 "uuid": 
"e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:04.540 "is_configured": true, 00:17:04.540 "data_offset": 0, 00:17:04.540 "data_size": 65536 00:17:04.540 } 00:17:04.540 ] 00:17:04.540 }' 00:17:04.540 10:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.540 10:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.106 10:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.106 10:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:05.365 10:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:05.365 10:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:05.623 [2024-07-15 10:24:42.690680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:05.623 BaseBdev1 00:17:05.623 10:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:05.623 10:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:05.623 10:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:05.623 10:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:05.623 10:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:05.623 10:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:05.623 10:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:05.882 10:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:06.140 [ 00:17:06.140 { 00:17:06.140 "name": "BaseBdev1", 00:17:06.140 "aliases": [ 00:17:06.140 "64e79d2d-67d0-486f-9939-7d54c448e31a" 00:17:06.140 ], 00:17:06.140 "product_name": "Malloc disk", 00:17:06.140 "block_size": 512, 00:17:06.140 "num_blocks": 65536, 00:17:06.140 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:06.140 "assigned_rate_limits": { 00:17:06.140 "rw_ios_per_sec": 0, 00:17:06.140 "rw_mbytes_per_sec": 0, 00:17:06.140 "r_mbytes_per_sec": 0, 00:17:06.140 "w_mbytes_per_sec": 0 00:17:06.140 }, 00:17:06.140 "claimed": true, 00:17:06.140 "claim_type": "exclusive_write", 00:17:06.140 "zoned": false, 00:17:06.140 "supported_io_types": { 00:17:06.140 "read": true, 00:17:06.140 "write": true, 00:17:06.140 "unmap": true, 00:17:06.140 "flush": true, 00:17:06.140 "reset": true, 00:17:06.140 "nvme_admin": false, 00:17:06.140 "nvme_io": false, 00:17:06.140 "nvme_io_md": false, 00:17:06.140 "write_zeroes": true, 00:17:06.140 "zcopy": true, 00:17:06.140 "get_zone_info": false, 00:17:06.140 "zone_management": false, 00:17:06.140 "zone_append": false, 00:17:06.140 "compare": false, 00:17:06.140 "compare_and_write": false, 00:17:06.140 "abort": true, 00:17:06.140 "seek_hole": false, 00:17:06.140 "seek_data": false, 00:17:06.140 "copy": true, 00:17:06.140 "nvme_iov_md": false 00:17:06.140 }, 00:17:06.140 "memory_domains": [ 00:17:06.140 { 00:17:06.140 "dma_device_id": "system", 00:17:06.140 "dma_device_type": 1 00:17:06.140 }, 00:17:06.140 { 00:17:06.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.140 "dma_device_type": 2 00:17:06.140 } 00:17:06.140 ], 00:17:06.140 "driver_specific": {} 00:17:06.140 } 00:17:06.140 ] 
00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.140 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:06.397 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.397 "name": "Existed_Raid", 00:17:06.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.397 "strip_size_kb": 0, 00:17:06.397 "state": "configuring", 00:17:06.397 "raid_level": "raid1", 00:17:06.397 "superblock": false, 00:17:06.397 "num_base_bdevs": 3, 00:17:06.397 
"num_base_bdevs_discovered": 2, 00:17:06.397 "num_base_bdevs_operational": 3, 00:17:06.397 "base_bdevs_list": [ 00:17:06.397 { 00:17:06.397 "name": "BaseBdev1", 00:17:06.397 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:06.397 "is_configured": true, 00:17:06.397 "data_offset": 0, 00:17:06.397 "data_size": 65536 00:17:06.398 }, 00:17:06.398 { 00:17:06.398 "name": null, 00:17:06.398 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:06.398 "is_configured": false, 00:17:06.398 "data_offset": 0, 00:17:06.398 "data_size": 65536 00:17:06.398 }, 00:17:06.398 { 00:17:06.398 "name": "BaseBdev3", 00:17:06.398 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:06.398 "is_configured": true, 00:17:06.398 "data_offset": 0, 00:17:06.398 "data_size": 65536 00:17:06.398 } 00:17:06.398 ] 00:17:06.398 }' 00:17:06.398 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.398 10:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.963 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.963 10:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:07.222 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:07.222 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:07.480 [2024-07-15 10:24:44.451383] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.480 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:07.739 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:07.739 "name": "Existed_Raid", 00:17:07.739 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.739 "strip_size_kb": 0, 00:17:07.739 "state": "configuring", 00:17:07.739 "raid_level": "raid1", 00:17:07.739 "superblock": false, 00:17:07.739 "num_base_bdevs": 3, 00:17:07.739 "num_base_bdevs_discovered": 1, 00:17:07.739 "num_base_bdevs_operational": 3, 00:17:07.739 "base_bdevs_list": [ 00:17:07.739 { 00:17:07.739 "name": "BaseBdev1", 00:17:07.739 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:07.739 "is_configured": true, 00:17:07.739 "data_offset": 0, 00:17:07.739 "data_size": 65536 
00:17:07.739 }, 00:17:07.739 { 00:17:07.739 "name": null, 00:17:07.739 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:07.739 "is_configured": false, 00:17:07.739 "data_offset": 0, 00:17:07.739 "data_size": 65536 00:17:07.739 }, 00:17:07.739 { 00:17:07.739 "name": null, 00:17:07.739 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:07.739 "is_configured": false, 00:17:07.739 "data_offset": 0, 00:17:07.739 "data_size": 65536 00:17:07.739 } 00:17:07.739 ] 00:17:07.739 }' 00:17:07.739 10:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:07.739 10:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.305 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.305 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:08.305 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:08.305 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:08.563 [2024-07-15 10:24:45.710749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.563 10:24:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.563 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.822 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.822 "name": "Existed_Raid", 00:17:08.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.822 "strip_size_kb": 0, 00:17:08.822 "state": "configuring", 00:17:08.822 "raid_level": "raid1", 00:17:08.822 "superblock": false, 00:17:08.822 "num_base_bdevs": 3, 00:17:08.822 "num_base_bdevs_discovered": 2, 00:17:08.822 "num_base_bdevs_operational": 3, 00:17:08.822 "base_bdevs_list": [ 00:17:08.822 { 00:17:08.822 "name": "BaseBdev1", 00:17:08.822 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:08.822 "is_configured": true, 00:17:08.822 "data_offset": 0, 00:17:08.822 "data_size": 65536 00:17:08.822 }, 00:17:08.822 { 00:17:08.822 "name": null, 00:17:08.823 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:08.823 "is_configured": false, 00:17:08.823 "data_offset": 0, 00:17:08.823 "data_size": 65536 00:17:08.823 }, 00:17:08.823 { 00:17:08.823 "name": "BaseBdev3", 00:17:08.823 "uuid": 
"e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:08.823 "is_configured": true, 00:17:08.823 "data_offset": 0, 00:17:08.823 "data_size": 65536 00:17:08.823 } 00:17:08.823 ] 00:17:08.823 }' 00:17:08.823 10:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.823 10:24:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.389 10:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.389 10:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:09.647 10:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:09.647 10:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:09.906 [2024-07-15 10:24:47.034280] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.906 10:24:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.906 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:10.164 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:10.164 "name": "Existed_Raid", 00:17:10.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:10.164 "strip_size_kb": 0, 00:17:10.164 "state": "configuring", 00:17:10.164 "raid_level": "raid1", 00:17:10.164 "superblock": false, 00:17:10.164 "num_base_bdevs": 3, 00:17:10.164 "num_base_bdevs_discovered": 1, 00:17:10.164 "num_base_bdevs_operational": 3, 00:17:10.164 "base_bdevs_list": [ 00:17:10.164 { 00:17:10.164 "name": null, 00:17:10.164 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:10.164 "is_configured": false, 00:17:10.164 "data_offset": 0, 00:17:10.164 "data_size": 65536 00:17:10.164 }, 00:17:10.164 { 00:17:10.164 "name": null, 00:17:10.164 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:10.164 "is_configured": false, 00:17:10.164 "data_offset": 0, 00:17:10.164 "data_size": 65536 00:17:10.164 }, 00:17:10.164 { 00:17:10.164 "name": "BaseBdev3", 00:17:10.164 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:10.164 "is_configured": true, 00:17:10.164 "data_offset": 0, 00:17:10.164 "data_size": 65536 00:17:10.164 } 00:17:10.164 ] 00:17:10.164 }' 00:17:10.164 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:10.164 10:24:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:10.729 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:10.729 10:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.987 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:10.987 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:11.246 [2024-07-15 10:24:48.286000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.246 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.812 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.812 "name": "Existed_Raid", 00:17:11.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.812 "strip_size_kb": 0, 00:17:11.812 "state": "configuring", 00:17:11.812 "raid_level": "raid1", 00:17:11.812 "superblock": false, 00:17:11.812 "num_base_bdevs": 3, 00:17:11.812 "num_base_bdevs_discovered": 2, 00:17:11.812 "num_base_bdevs_operational": 3, 00:17:11.812 "base_bdevs_list": [ 00:17:11.812 { 00:17:11.812 "name": null, 00:17:11.812 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:11.812 "is_configured": false, 00:17:11.812 "data_offset": 0, 00:17:11.812 "data_size": 65536 00:17:11.812 }, 00:17:11.812 { 00:17:11.812 "name": "BaseBdev2", 00:17:11.812 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:11.812 "is_configured": true, 00:17:11.812 "data_offset": 0, 00:17:11.812 "data_size": 65536 00:17:11.812 }, 00:17:11.812 { 00:17:11.812 "name": "BaseBdev3", 00:17:11.812 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:11.812 "is_configured": true, 00:17:11.812 "data_offset": 0, 00:17:11.812 "data_size": 65536 00:17:11.812 } 00:17:11.812 ] 00:17:11.812 }' 00:17:11.812 10:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.812 10:24:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.376 10:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:12.376 10:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.634 10:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:12.634 10:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.634 10:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:12.891 10:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 64e79d2d-67d0-486f-9939-7d54c448e31a 00:17:13.491 [2024-07-15 10:24:50.404381] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:13.491 [2024-07-15 10:24:50.404429] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1799e40 00:17:13.491 [2024-07-15 10:24:50.404438] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:17:13.491 [2024-07-15 10:24:50.404630] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1796e60 00:17:13.491 [2024-07-15 10:24:50.404756] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1799e40 00:17:13.491 [2024-07-15 10:24:50.404766] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1799e40 00:17:13.491 [2024-07-15 10:24:50.404950] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:13.491 NewBaseBdev 00:17:13.491 10:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:13.491 10:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:13.491 10:24:50 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:13.491 10:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:13.491 10:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:13.492 10:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:13.492 10:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:13.749 10:24:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:14.007 [ 00:17:14.007 { 00:17:14.007 "name": "NewBaseBdev", 00:17:14.007 "aliases": [ 00:17:14.007 "64e79d2d-67d0-486f-9939-7d54c448e31a" 00:17:14.007 ], 00:17:14.007 "product_name": "Malloc disk", 00:17:14.007 "block_size": 512, 00:17:14.007 "num_blocks": 65536, 00:17:14.007 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:14.007 "assigned_rate_limits": { 00:17:14.007 "rw_ios_per_sec": 0, 00:17:14.007 "rw_mbytes_per_sec": 0, 00:17:14.007 "r_mbytes_per_sec": 0, 00:17:14.007 "w_mbytes_per_sec": 0 00:17:14.007 }, 00:17:14.007 "claimed": true, 00:17:14.007 "claim_type": "exclusive_write", 00:17:14.007 "zoned": false, 00:17:14.007 "supported_io_types": { 00:17:14.007 "read": true, 00:17:14.007 "write": true, 00:17:14.007 "unmap": true, 00:17:14.007 "flush": true, 00:17:14.007 "reset": true, 00:17:14.007 "nvme_admin": false, 00:17:14.007 "nvme_io": false, 00:17:14.007 "nvme_io_md": false, 00:17:14.007 "write_zeroes": true, 00:17:14.007 "zcopy": true, 00:17:14.007 "get_zone_info": false, 00:17:14.007 "zone_management": false, 00:17:14.007 "zone_append": false, 00:17:14.007 "compare": false, 00:17:14.007 "compare_and_write": false, 
00:17:14.007 "abort": true, 00:17:14.007 "seek_hole": false, 00:17:14.007 "seek_data": false, 00:17:14.007 "copy": true, 00:17:14.007 "nvme_iov_md": false 00:17:14.007 }, 00:17:14.007 "memory_domains": [ 00:17:14.007 { 00:17:14.007 "dma_device_id": "system", 00:17:14.007 "dma_device_type": 1 00:17:14.007 }, 00:17:14.007 { 00:17:14.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.007 "dma_device_type": 2 00:17:14.007 } 00:17:14.007 ], 00:17:14.007 "driver_specific": {} 00:17:14.007 } 00:17:14.007 ] 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.007 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.007 10:24:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.573 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.573 "name": "Existed_Raid", 00:17:14.573 "uuid": "f19f90e4-bf94-4fd7-8ed5-d93d9662c0e3", 00:17:14.573 "strip_size_kb": 0, 00:17:14.573 "state": "online", 00:17:14.573 "raid_level": "raid1", 00:17:14.573 "superblock": false, 00:17:14.573 "num_base_bdevs": 3, 00:17:14.573 "num_base_bdevs_discovered": 3, 00:17:14.573 "num_base_bdevs_operational": 3, 00:17:14.573 "base_bdevs_list": [ 00:17:14.573 { 00:17:14.573 "name": "NewBaseBdev", 00:17:14.573 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:14.573 "is_configured": true, 00:17:14.573 "data_offset": 0, 00:17:14.573 "data_size": 65536 00:17:14.573 }, 00:17:14.573 { 00:17:14.573 "name": "BaseBdev2", 00:17:14.573 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:14.573 "is_configured": true, 00:17:14.573 "data_offset": 0, 00:17:14.573 "data_size": 65536 00:17:14.573 }, 00:17:14.573 { 00:17:14.573 "name": "BaseBdev3", 00:17:14.573 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:14.573 "is_configured": true, 00:17:14.573 "data_offset": 0, 00:17:14.573 "data_size": 65536 00:17:14.573 } 00:17:14.573 ] 00:17:14.573 }' 00:17:14.573 10:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.573 10:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.140 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:15.140 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:15.140 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:15.140 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:15.140 
10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:15.140 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:15.140 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:15.140 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:15.398 [2024-07-15 10:24:52.522320] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:15.398 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:15.398 "name": "Existed_Raid", 00:17:15.398 "aliases": [ 00:17:15.398 "f19f90e4-bf94-4fd7-8ed5-d93d9662c0e3" 00:17:15.398 ], 00:17:15.398 "product_name": "Raid Volume", 00:17:15.398 "block_size": 512, 00:17:15.398 "num_blocks": 65536, 00:17:15.398 "uuid": "f19f90e4-bf94-4fd7-8ed5-d93d9662c0e3", 00:17:15.398 "assigned_rate_limits": { 00:17:15.398 "rw_ios_per_sec": 0, 00:17:15.398 "rw_mbytes_per_sec": 0, 00:17:15.398 "r_mbytes_per_sec": 0, 00:17:15.398 "w_mbytes_per_sec": 0 00:17:15.398 }, 00:17:15.398 "claimed": false, 00:17:15.398 "zoned": false, 00:17:15.398 "supported_io_types": { 00:17:15.398 "read": true, 00:17:15.398 "write": true, 00:17:15.398 "unmap": false, 00:17:15.398 "flush": false, 00:17:15.398 "reset": true, 00:17:15.398 "nvme_admin": false, 00:17:15.398 "nvme_io": false, 00:17:15.398 "nvme_io_md": false, 00:17:15.398 "write_zeroes": true, 00:17:15.398 "zcopy": false, 00:17:15.398 "get_zone_info": false, 00:17:15.398 "zone_management": false, 00:17:15.398 "zone_append": false, 00:17:15.398 "compare": false, 00:17:15.398 "compare_and_write": false, 00:17:15.398 "abort": false, 00:17:15.398 "seek_hole": false, 00:17:15.398 "seek_data": false, 00:17:15.398 "copy": false, 00:17:15.398 "nvme_iov_md": false 00:17:15.398 }, 00:17:15.398 
"memory_domains": [ 00:17:15.398 { 00:17:15.398 "dma_device_id": "system", 00:17:15.398 "dma_device_type": 1 00:17:15.398 }, 00:17:15.398 { 00:17:15.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.398 "dma_device_type": 2 00:17:15.398 }, 00:17:15.398 { 00:17:15.398 "dma_device_id": "system", 00:17:15.398 "dma_device_type": 1 00:17:15.398 }, 00:17:15.398 { 00:17:15.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.398 "dma_device_type": 2 00:17:15.398 }, 00:17:15.398 { 00:17:15.398 "dma_device_id": "system", 00:17:15.398 "dma_device_type": 1 00:17:15.398 }, 00:17:15.398 { 00:17:15.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.398 "dma_device_type": 2 00:17:15.398 } 00:17:15.398 ], 00:17:15.398 "driver_specific": { 00:17:15.398 "raid": { 00:17:15.398 "uuid": "f19f90e4-bf94-4fd7-8ed5-d93d9662c0e3", 00:17:15.398 "strip_size_kb": 0, 00:17:15.398 "state": "online", 00:17:15.398 "raid_level": "raid1", 00:17:15.398 "superblock": false, 00:17:15.398 "num_base_bdevs": 3, 00:17:15.398 "num_base_bdevs_discovered": 3, 00:17:15.398 "num_base_bdevs_operational": 3, 00:17:15.398 "base_bdevs_list": [ 00:17:15.398 { 00:17:15.398 "name": "NewBaseBdev", 00:17:15.398 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:15.398 "is_configured": true, 00:17:15.398 "data_offset": 0, 00:17:15.398 "data_size": 65536 00:17:15.398 }, 00:17:15.398 { 00:17:15.398 "name": "BaseBdev2", 00:17:15.398 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:15.398 "is_configured": true, 00:17:15.398 "data_offset": 0, 00:17:15.398 "data_size": 65536 00:17:15.398 }, 00:17:15.398 { 00:17:15.398 "name": "BaseBdev3", 00:17:15.398 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:15.398 "is_configured": true, 00:17:15.398 "data_offset": 0, 00:17:15.398 "data_size": 65536 00:17:15.398 } 00:17:15.398 ] 00:17:15.398 } 00:17:15.398 } 00:17:15.398 }' 00:17:15.398 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:17:15.398 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:15.398 BaseBdev2 00:17:15.398 BaseBdev3' 00:17:15.398 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:15.656 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:15.656 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:15.656 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:15.656 "name": "NewBaseBdev", 00:17:15.656 "aliases": [ 00:17:15.656 "64e79d2d-67d0-486f-9939-7d54c448e31a" 00:17:15.656 ], 00:17:15.656 "product_name": "Malloc disk", 00:17:15.656 "block_size": 512, 00:17:15.656 "num_blocks": 65536, 00:17:15.656 "uuid": "64e79d2d-67d0-486f-9939-7d54c448e31a", 00:17:15.656 "assigned_rate_limits": { 00:17:15.656 "rw_ios_per_sec": 0, 00:17:15.656 "rw_mbytes_per_sec": 0, 00:17:15.656 "r_mbytes_per_sec": 0, 00:17:15.656 "w_mbytes_per_sec": 0 00:17:15.656 }, 00:17:15.656 "claimed": true, 00:17:15.656 "claim_type": "exclusive_write", 00:17:15.656 "zoned": false, 00:17:15.656 "supported_io_types": { 00:17:15.656 "read": true, 00:17:15.656 "write": true, 00:17:15.656 "unmap": true, 00:17:15.656 "flush": true, 00:17:15.656 "reset": true, 00:17:15.656 "nvme_admin": false, 00:17:15.656 "nvme_io": false, 00:17:15.656 "nvme_io_md": false, 00:17:15.656 "write_zeroes": true, 00:17:15.656 "zcopy": true, 00:17:15.656 "get_zone_info": false, 00:17:15.656 "zone_management": false, 00:17:15.656 "zone_append": false, 00:17:15.656 "compare": false, 00:17:15.656 "compare_and_write": false, 00:17:15.656 "abort": true, 00:17:15.656 "seek_hole": false, 00:17:15.656 "seek_data": false, 00:17:15.656 "copy": true, 00:17:15.656 "nvme_iov_md": 
false 00:17:15.656 }, 00:17:15.656 "memory_domains": [ 00:17:15.656 { 00:17:15.656 "dma_device_id": "system", 00:17:15.656 "dma_device_type": 1 00:17:15.656 }, 00:17:15.656 { 00:17:15.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.656 "dma_device_type": 2 00:17:15.656 } 00:17:15.656 ], 00:17:15.656 "driver_specific": {} 00:17:15.656 }' 00:17:15.656 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:15.914 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:15.914 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:15.914 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:15.914 10:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:15.914 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:15.914 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:15.914 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:15.914 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:15.914 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.171 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.171 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:16.171 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.171 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:16.171 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
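
The test above pulls the names of the configured base bdevs out of the raid bdev's `driver_specific` blob with `jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'`, then loops over each name with `bdev_get_bdevs`. As a minimal sketch of that same selection, here is the equivalent logic in Python run against a hand-trimmed copy of the JSON shape from the log (only the fields the filter touches are reproduced; the sample values are taken from the output above, with one bdev flipped to unconfigured to show the filter discarding it):

```python
import json

# Minimal stand-in for the `bdev_get_bdevs` output captured in the log;
# only the fields the jq selection reads are reproduced here.
raid_info = json.loads("""
{
  "driver_specific": {
    "raid": {
      "raid_level": "raid1",
      "base_bdevs_list": [
        {"name": "NewBaseBdev", "is_configured": true},
        {"name": "BaseBdev2",   "is_configured": true},
        {"name": "BaseBdev3",   "is_configured": false}
      ]
    }
  }
}
""")

# Equivalent of:
#   jq -r '.driver_specific.raid.base_bdevs_list[]
#          | select(.is_configured == true).name'
names = [b["name"]
         for b in raid_info["driver_specific"]["raid"]["base_bdevs_list"]
         if b["is_configured"]]

print(names)  # only the configured base bdevs survive the filter
```

In the log all three base bdevs are configured, so `base_bdev_names` ends up holding `NewBaseBdev`, `BaseBdev2`, and `BaseBdev3`, and the per-bdev `jq .block_size` / `.md_size` / `.md_interleave` / `.dif_type` checks that follow run once for each.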
00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:16.429 "name": "BaseBdev2", 00:17:16.429 "aliases": [ 00:17:16.429 "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6" 00:17:16.429 ], 00:17:16.429 "product_name": "Malloc disk", 00:17:16.429 "block_size": 512, 00:17:16.429 "num_blocks": 65536, 00:17:16.429 "uuid": "839d792b-d4f9-42c3-9b43-aa2fbf21a0d6", 00:17:16.429 "assigned_rate_limits": { 00:17:16.429 "rw_ios_per_sec": 0, 00:17:16.429 "rw_mbytes_per_sec": 0, 00:17:16.429 "r_mbytes_per_sec": 0, 00:17:16.429 "w_mbytes_per_sec": 0 00:17:16.429 }, 00:17:16.429 "claimed": true, 00:17:16.429 "claim_type": "exclusive_write", 00:17:16.429 "zoned": false, 00:17:16.429 "supported_io_types": { 00:17:16.429 "read": true, 00:17:16.429 "write": true, 00:17:16.429 "unmap": true, 00:17:16.429 "flush": true, 00:17:16.429 "reset": true, 00:17:16.429 "nvme_admin": false, 00:17:16.429 "nvme_io": false, 00:17:16.429 "nvme_io_md": false, 00:17:16.429 "write_zeroes": true, 00:17:16.429 "zcopy": true, 00:17:16.429 "get_zone_info": false, 00:17:16.429 "zone_management": false, 00:17:16.429 "zone_append": false, 00:17:16.429 "compare": false, 00:17:16.429 "compare_and_write": false, 00:17:16.429 "abort": true, 00:17:16.429 "seek_hole": false, 00:17:16.429 "seek_data": false, 00:17:16.429 "copy": true, 00:17:16.429 "nvme_iov_md": false 00:17:16.429 }, 00:17:16.429 "memory_domains": [ 00:17:16.429 { 00:17:16.429 "dma_device_id": "system", 00:17:16.429 "dma_device_type": 1 00:17:16.429 }, 00:17:16.429 { 00:17:16.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.429 "dma_device_type": 2 00:17:16.429 } 00:17:16.429 ], 00:17:16.429 "driver_specific": {} 00:17:16.429 }' 00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:16.429 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:16.687 10:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:16.945 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:16.945 "name": "BaseBdev3", 00:17:16.945 "aliases": [ 00:17:16.945 "e3acdc15-356f-41bc-a488-eecc605eee62" 00:17:16.945 ], 00:17:16.945 "product_name": "Malloc disk", 00:17:16.945 "block_size": 512, 00:17:16.945 "num_blocks": 65536, 00:17:16.945 "uuid": "e3acdc15-356f-41bc-a488-eecc605eee62", 00:17:16.945 "assigned_rate_limits": { 00:17:16.945 "rw_ios_per_sec": 0, 00:17:16.945 "rw_mbytes_per_sec": 0, 00:17:16.945 "r_mbytes_per_sec": 0, 00:17:16.945 "w_mbytes_per_sec": 0 00:17:16.945 }, 
00:17:16.945 "claimed": true, 00:17:16.945 "claim_type": "exclusive_write", 00:17:16.945 "zoned": false, 00:17:16.945 "supported_io_types": { 00:17:16.945 "read": true, 00:17:16.945 "write": true, 00:17:16.945 "unmap": true, 00:17:16.945 "flush": true, 00:17:16.945 "reset": true, 00:17:16.945 "nvme_admin": false, 00:17:16.945 "nvme_io": false, 00:17:16.945 "nvme_io_md": false, 00:17:16.945 "write_zeroes": true, 00:17:16.945 "zcopy": true, 00:17:16.945 "get_zone_info": false, 00:17:16.945 "zone_management": false, 00:17:16.945 "zone_append": false, 00:17:16.945 "compare": false, 00:17:16.945 "compare_and_write": false, 00:17:16.945 "abort": true, 00:17:16.945 "seek_hole": false, 00:17:16.945 "seek_data": false, 00:17:16.945 "copy": true, 00:17:16.945 "nvme_iov_md": false 00:17:16.945 }, 00:17:16.945 "memory_domains": [ 00:17:16.945 { 00:17:16.945 "dma_device_id": "system", 00:17:16.945 "dma_device_type": 1 00:17:16.945 }, 00:17:16.945 { 00:17:16.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.945 "dma_device_type": 2 00:17:16.945 } 00:17:16.945 ], 00:17:16.945 "driver_specific": {} 00:17:16.945 }' 00:17:16.945 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.945 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:16.945 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:16.945 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:17.204 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:17.462 [2024-07-15 10:24:54.583473] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:17.462 [2024-07-15 10:24:54.583499] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:17.462 [2024-07-15 10:24:54.583553] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:17.463 [2024-07-15 10:24:54.583829] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:17.463 [2024-07-15 10:24:54.583841] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1799e40 name Existed_Raid, state offline 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 521172 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 521172 ']' 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 521172 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 521172 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 
-- # process_name=reactor_0 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 521172' 00:17:17.463 killing process with pid 521172 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 521172 00:17:17.463 [2024-07-15 10:24:54.643284] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:17.463 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 521172 00:17:17.721 [2024-07-15 10:24:54.669426] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:17.722 10:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:17.722 00:17:17.722 real 0m28.879s 00:17:17.722 user 0m53.153s 00:17:17.722 sys 0m5.064s 00:17:17.722 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:17.722 10:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:17.722 ************************************ 00:17:17.722 END TEST raid_state_function_test 00:17:17.722 ************************************ 00:17:17.981 10:24:54 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:17.981 10:24:54 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:17:17.981 10:24:54 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:17.981 10:24:54 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:17.981 10:24:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:17.981 ************************************ 00:17:17.981 START TEST raid_state_function_test_sb 00:17:17.981 ************************************ 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:17.981 10:24:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=525469 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 525469' 00:17:17.981 Process raid pid: 525469 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 525469 /var/tmp/spdk-raid.sock 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 525469 ']' 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to 
start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:17.981 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:17.981 10:24:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:17.981 [2024-07-15 10:24:55.036540] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:17:17.981 [2024-07-15 10:24:55.036606] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:17.981 [2024-07-15 10:24:55.165671] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:18.239 [2024-07-15 10:24:55.269742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:18.239 [2024-07-15 10:24:55.339693] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:18.239 [2024-07-15 10:24:55.339723] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:18.805 10:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:18.805 10:24:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:18.805 10:24:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:19.075 [2024-07-15 10:24:56.194748] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:19.075 [2024-07-15 10:24:56.194793] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:19.075 [2024-07-15 
10:24:56.194804] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:19.075 [2024-07-15 10:24:56.194816] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:19.075 [2024-07-15 10:24:56.194825] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:19.075 [2024-07-15 10:24:56.194836] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.075 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:17:19.333 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.333 "name": "Existed_Raid", 00:17:19.333 "uuid": "40fcc715-d80d-4eec-abba-fb8d7f813190", 00:17:19.333 "strip_size_kb": 0, 00:17:19.333 "state": "configuring", 00:17:19.333 "raid_level": "raid1", 00:17:19.333 "superblock": true, 00:17:19.333 "num_base_bdevs": 3, 00:17:19.333 "num_base_bdevs_discovered": 0, 00:17:19.333 "num_base_bdevs_operational": 3, 00:17:19.333 "base_bdevs_list": [ 00:17:19.333 { 00:17:19.333 "name": "BaseBdev1", 00:17:19.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:19.333 "is_configured": false, 00:17:19.333 "data_offset": 0, 00:17:19.333 "data_size": 0 00:17:19.333 }, 00:17:19.333 { 00:17:19.333 "name": "BaseBdev2", 00:17:19.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:19.333 "is_configured": false, 00:17:19.333 "data_offset": 0, 00:17:19.333 "data_size": 0 00:17:19.333 }, 00:17:19.333 { 00:17:19.333 "name": "BaseBdev3", 00:17:19.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:19.333 "is_configured": false, 00:17:19.333 "data_offset": 0, 00:17:19.333 "data_size": 0 00:17:19.333 } 00:17:19.333 ] 00:17:19.333 }' 00:17:19.333 10:24:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.333 10:24:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:19.899 10:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:20.157 [2024-07-15 10:24:57.285481] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:20.157 [2024-07-15 10:24:57.285510] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x196ea80 name Existed_Raid, state configuring 00:17:20.157 10:24:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:20.415 [2024-07-15 10:24:57.465988] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:20.415 [2024-07-15 10:24:57.466015] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:20.415 [2024-07-15 10:24:57.466024] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:20.415 [2024-07-15 10:24:57.466036] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:20.415 [2024-07-15 10:24:57.466044] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:20.415 [2024-07-15 10:24:57.466055] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:20.415 10:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:20.673 [2024-07-15 10:24:57.656344] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:20.673 BaseBdev1 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.673 10:24:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:20.931 [ 00:17:20.931 { 00:17:20.931 "name": "BaseBdev1", 00:17:20.931 "aliases": [ 00:17:20.931 "7343cee0-14a7-4bf4-b22c-5c39f0595acf" 00:17:20.931 ], 00:17:20.931 "product_name": "Malloc disk", 00:17:20.931 "block_size": 512, 00:17:20.931 "num_blocks": 65536, 00:17:20.931 "uuid": "7343cee0-14a7-4bf4-b22c-5c39f0595acf", 00:17:20.931 "assigned_rate_limits": { 00:17:20.931 "rw_ios_per_sec": 0, 00:17:20.931 "rw_mbytes_per_sec": 0, 00:17:20.931 "r_mbytes_per_sec": 0, 00:17:20.931 "w_mbytes_per_sec": 0 00:17:20.931 }, 00:17:20.931 "claimed": true, 00:17:20.931 "claim_type": "exclusive_write", 00:17:20.931 "zoned": false, 00:17:20.931 "supported_io_types": { 00:17:20.931 "read": true, 00:17:20.931 "write": true, 00:17:20.931 "unmap": true, 00:17:20.931 "flush": true, 00:17:20.931 "reset": true, 00:17:20.931 "nvme_admin": false, 00:17:20.931 "nvme_io": false, 00:17:20.931 "nvme_io_md": false, 00:17:20.931 "write_zeroes": true, 00:17:20.931 "zcopy": true, 00:17:20.931 "get_zone_info": false, 00:17:20.931 "zone_management": false, 00:17:20.931 "zone_append": false, 00:17:20.931 "compare": false, 00:17:20.931 "compare_and_write": false, 00:17:20.931 "abort": true, 00:17:20.931 "seek_hole": false, 00:17:20.931 "seek_data": false, 00:17:20.931 "copy": true, 00:17:20.931 "nvme_iov_md": false 00:17:20.931 }, 00:17:20.931 "memory_domains": [ 00:17:20.931 { 00:17:20.931 "dma_device_id": "system", 00:17:20.931 "dma_device_type": 1 00:17:20.931 }, 00:17:20.931 { 00:17:20.931 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:17:20.931 "dma_device_type": 2 00:17:20.931 } 00:17:20.931 ], 00:17:20.931 "driver_specific": {} 00:17:20.931 } 00:17:20.931 ] 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.931 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.189 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.189 "name": "Existed_Raid", 00:17:21.189 "uuid": "ce302b8d-0cb5-41c1-8a3e-a16182f89929", 
00:17:21.189 "strip_size_kb": 0, 00:17:21.189 "state": "configuring", 00:17:21.189 "raid_level": "raid1", 00:17:21.189 "superblock": true, 00:17:21.189 "num_base_bdevs": 3, 00:17:21.189 "num_base_bdevs_discovered": 1, 00:17:21.189 "num_base_bdevs_operational": 3, 00:17:21.189 "base_bdevs_list": [ 00:17:21.189 { 00:17:21.189 "name": "BaseBdev1", 00:17:21.189 "uuid": "7343cee0-14a7-4bf4-b22c-5c39f0595acf", 00:17:21.189 "is_configured": true, 00:17:21.189 "data_offset": 2048, 00:17:21.189 "data_size": 63488 00:17:21.189 }, 00:17:21.189 { 00:17:21.189 "name": "BaseBdev2", 00:17:21.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.189 "is_configured": false, 00:17:21.189 "data_offset": 0, 00:17:21.189 "data_size": 0 00:17:21.189 }, 00:17:21.189 { 00:17:21.189 "name": "BaseBdev3", 00:17:21.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.189 "is_configured": false, 00:17:21.189 "data_offset": 0, 00:17:21.189 "data_size": 0 00:17:21.189 } 00:17:21.189 ] 00:17:21.189 }' 00:17:21.189 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.189 10:24:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:21.755 10:24:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:22.014 [2024-07-15 10:24:59.007939] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:22.014 [2024-07-15 10:24:59.007977] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x196e310 name Existed_Raid, state configuring 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:22.014 [2024-07-15 10:24:59.172420] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:22.014 [2024-07-15 10:24:59.173939] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:22.014 [2024-07-15 10:24:59.173971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:22.014 [2024-07-15 10:24:59.173981] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:22.014 [2024-07-15 10:24:59.173992] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.014 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.273 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.273 "name": "Existed_Raid", 00:17:22.273 "uuid": "e920bd06-4f7f-401a-a415-61d9980f6d93", 00:17:22.273 "strip_size_kb": 0, 00:17:22.273 "state": "configuring", 00:17:22.273 "raid_level": "raid1", 00:17:22.273 "superblock": true, 00:17:22.273 "num_base_bdevs": 3, 00:17:22.273 "num_base_bdevs_discovered": 1, 00:17:22.273 "num_base_bdevs_operational": 3, 00:17:22.273 "base_bdevs_list": [ 00:17:22.273 { 00:17:22.273 "name": "BaseBdev1", 00:17:22.273 "uuid": "7343cee0-14a7-4bf4-b22c-5c39f0595acf", 00:17:22.273 "is_configured": true, 00:17:22.273 "data_offset": 2048, 00:17:22.273 "data_size": 63488 00:17:22.273 }, 00:17:22.273 { 00:17:22.273 "name": "BaseBdev2", 00:17:22.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.273 "is_configured": false, 00:17:22.273 "data_offset": 0, 00:17:22.273 "data_size": 0 00:17:22.273 }, 00:17:22.273 { 00:17:22.273 "name": "BaseBdev3", 00:17:22.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.273 "is_configured": false, 00:17:22.273 "data_offset": 0, 00:17:22.273 "data_size": 0 00:17:22.273 } 00:17:22.273 ] 00:17:22.273 }' 00:17:22.273 10:24:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.273 10:24:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:23.208 
[2024-07-15 10:25:00.286770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:23.208 BaseBdev2 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:23.208 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:23.466 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:23.724 [ 00:17:23.724 { 00:17:23.724 "name": "BaseBdev2", 00:17:23.724 "aliases": [ 00:17:23.724 "c621a32c-ee9c-4b77-99d5-15384ef192ba" 00:17:23.724 ], 00:17:23.724 "product_name": "Malloc disk", 00:17:23.724 "block_size": 512, 00:17:23.724 "num_blocks": 65536, 00:17:23.724 "uuid": "c621a32c-ee9c-4b77-99d5-15384ef192ba", 00:17:23.724 "assigned_rate_limits": { 00:17:23.724 "rw_ios_per_sec": 0, 00:17:23.724 "rw_mbytes_per_sec": 0, 00:17:23.724 "r_mbytes_per_sec": 0, 00:17:23.724 "w_mbytes_per_sec": 0 00:17:23.724 }, 00:17:23.724 "claimed": true, 00:17:23.724 "claim_type": "exclusive_write", 00:17:23.724 "zoned": false, 00:17:23.724 "supported_io_types": { 00:17:23.724 "read": true, 00:17:23.724 "write": true, 00:17:23.724 "unmap": 
true, 00:17:23.724 "flush": true, 00:17:23.724 "reset": true, 00:17:23.724 "nvme_admin": false, 00:17:23.724 "nvme_io": false, 00:17:23.724 "nvme_io_md": false, 00:17:23.724 "write_zeroes": true, 00:17:23.724 "zcopy": true, 00:17:23.724 "get_zone_info": false, 00:17:23.724 "zone_management": false, 00:17:23.724 "zone_append": false, 00:17:23.724 "compare": false, 00:17:23.724 "compare_and_write": false, 00:17:23.724 "abort": true, 00:17:23.724 "seek_hole": false, 00:17:23.724 "seek_data": false, 00:17:23.724 "copy": true, 00:17:23.724 "nvme_iov_md": false 00:17:23.724 }, 00:17:23.724 "memory_domains": [ 00:17:23.724 { 00:17:23.724 "dma_device_id": "system", 00:17:23.724 "dma_device_type": 1 00:17:23.724 }, 00:17:23.724 { 00:17:23.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.724 "dma_device_type": 2 00:17:23.724 } 00:17:23.724 ], 00:17:23.724 "driver_specific": {} 00:17:23.724 } 00:17:23.724 ] 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:23.724 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:23.725 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:23.725 
10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.725 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.725 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.725 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.725 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.725 10:25:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.982 10:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.982 "name": "Existed_Raid", 00:17:23.982 "uuid": "e920bd06-4f7f-401a-a415-61d9980f6d93", 00:17:23.982 "strip_size_kb": 0, 00:17:23.982 "state": "configuring", 00:17:23.982 "raid_level": "raid1", 00:17:23.982 "superblock": true, 00:17:23.982 "num_base_bdevs": 3, 00:17:23.982 "num_base_bdevs_discovered": 2, 00:17:23.982 "num_base_bdevs_operational": 3, 00:17:23.982 "base_bdevs_list": [ 00:17:23.982 { 00:17:23.982 "name": "BaseBdev1", 00:17:23.982 "uuid": "7343cee0-14a7-4bf4-b22c-5c39f0595acf", 00:17:23.982 "is_configured": true, 00:17:23.982 "data_offset": 2048, 00:17:23.982 "data_size": 63488 00:17:23.982 }, 00:17:23.982 { 00:17:23.982 "name": "BaseBdev2", 00:17:23.982 "uuid": "c621a32c-ee9c-4b77-99d5-15384ef192ba", 00:17:23.982 "is_configured": true, 00:17:23.982 "data_offset": 2048, 00:17:23.982 "data_size": 63488 00:17:23.982 }, 00:17:23.982 { 00:17:23.982 "name": "BaseBdev3", 00:17:23.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.982 "is_configured": false, 00:17:23.982 "data_offset": 0, 00:17:23.982 "data_size": 0 00:17:23.982 } 00:17:23.982 ] 00:17:23.982 }' 00:17:23.982 
10:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.982 10:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.549 10:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:24.808 [2024-07-15 10:25:01.858412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:24.808 [2024-07-15 10:25:01.858568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x196f400 00:17:24.808 [2024-07-15 10:25:01.858583] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:24.808 [2024-07-15 10:25:01.858749] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x196eef0 00:17:24.808 [2024-07-15 10:25:01.858868] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x196f400 00:17:24.808 [2024-07-15 10:25:01.858879] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x196f400 00:17:24.808 [2024-07-15 10:25:01.858977] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:24.808 BaseBdev3 00:17:24.808 10:25:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:24.808 10:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:24.808 10:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:24.808 10:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:24.808 10:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:24.808 10:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:17:24.808 10:25:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:25.066 10:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:25.324 [ 00:17:25.324 { 00:17:25.324 "name": "BaseBdev3", 00:17:25.324 "aliases": [ 00:17:25.324 "07e02611-3ce2-488e-a767-8decf75e829b" 00:17:25.324 ], 00:17:25.324 "product_name": "Malloc disk", 00:17:25.324 "block_size": 512, 00:17:25.324 "num_blocks": 65536, 00:17:25.324 "uuid": "07e02611-3ce2-488e-a767-8decf75e829b", 00:17:25.324 "assigned_rate_limits": { 00:17:25.324 "rw_ios_per_sec": 0, 00:17:25.324 "rw_mbytes_per_sec": 0, 00:17:25.324 "r_mbytes_per_sec": 0, 00:17:25.324 "w_mbytes_per_sec": 0 00:17:25.324 }, 00:17:25.324 "claimed": true, 00:17:25.324 "claim_type": "exclusive_write", 00:17:25.324 "zoned": false, 00:17:25.324 "supported_io_types": { 00:17:25.324 "read": true, 00:17:25.324 "write": true, 00:17:25.324 "unmap": true, 00:17:25.324 "flush": true, 00:17:25.324 "reset": true, 00:17:25.324 "nvme_admin": false, 00:17:25.324 "nvme_io": false, 00:17:25.324 "nvme_io_md": false, 00:17:25.324 "write_zeroes": true, 00:17:25.324 "zcopy": true, 00:17:25.324 "get_zone_info": false, 00:17:25.324 "zone_management": false, 00:17:25.324 "zone_append": false, 00:17:25.324 "compare": false, 00:17:25.324 "compare_and_write": false, 00:17:25.324 "abort": true, 00:17:25.324 "seek_hole": false, 00:17:25.324 "seek_data": false, 00:17:25.324 "copy": true, 00:17:25.324 "nvme_iov_md": false 00:17:25.324 }, 00:17:25.324 "memory_domains": [ 00:17:25.324 { 00:17:25.324 "dma_device_id": "system", 00:17:25.324 "dma_device_type": 1 00:17:25.324 }, 00:17:25.324 { 00:17:25.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.324 
"dma_device_type": 2 00:17:25.324 } 00:17:25.324 ], 00:17:25.324 "driver_specific": {} 00:17:25.324 } 00:17:25.324 ] 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.324 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.582 10:25:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.582 "name": "Existed_Raid", 00:17:25.582 "uuid": "e920bd06-4f7f-401a-a415-61d9980f6d93", 00:17:25.582 "strip_size_kb": 0, 00:17:25.582 "state": "online", 00:17:25.582 "raid_level": "raid1", 00:17:25.582 "superblock": true, 00:17:25.582 "num_base_bdevs": 3, 00:17:25.582 "num_base_bdevs_discovered": 3, 00:17:25.582 "num_base_bdevs_operational": 3, 00:17:25.582 "base_bdevs_list": [ 00:17:25.582 { 00:17:25.582 "name": "BaseBdev1", 00:17:25.582 "uuid": "7343cee0-14a7-4bf4-b22c-5c39f0595acf", 00:17:25.582 "is_configured": true, 00:17:25.582 "data_offset": 2048, 00:17:25.582 "data_size": 63488 00:17:25.582 }, 00:17:25.582 { 00:17:25.582 "name": "BaseBdev2", 00:17:25.582 "uuid": "c621a32c-ee9c-4b77-99d5-15384ef192ba", 00:17:25.582 "is_configured": true, 00:17:25.582 "data_offset": 2048, 00:17:25.582 "data_size": 63488 00:17:25.582 }, 00:17:25.582 { 00:17:25.582 "name": "BaseBdev3", 00:17:25.582 "uuid": "07e02611-3ce2-488e-a767-8decf75e829b", 00:17:25.582 "is_configured": true, 00:17:25.582 "data_offset": 2048, 00:17:25.582 "data_size": 63488 00:17:25.582 } 00:17:25.582 ] 00:17:25.582 }' 00:17:25.582 10:25:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.582 10:25:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:26.148 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:26.406 [2024-07-15 10:25:03.390827] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:26.406 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:26.406 "name": "Existed_Raid", 00:17:26.406 "aliases": [ 00:17:26.406 "e920bd06-4f7f-401a-a415-61d9980f6d93" 00:17:26.406 ], 00:17:26.406 "product_name": "Raid Volume", 00:17:26.406 "block_size": 512, 00:17:26.406 "num_blocks": 63488, 00:17:26.406 "uuid": "e920bd06-4f7f-401a-a415-61d9980f6d93", 00:17:26.406 "assigned_rate_limits": { 00:17:26.406 "rw_ios_per_sec": 0, 00:17:26.406 "rw_mbytes_per_sec": 0, 00:17:26.406 "r_mbytes_per_sec": 0, 00:17:26.406 "w_mbytes_per_sec": 0 00:17:26.406 }, 00:17:26.406 "claimed": false, 00:17:26.406 "zoned": false, 00:17:26.406 "supported_io_types": { 00:17:26.406 "read": true, 00:17:26.406 "write": true, 00:17:26.406 "unmap": false, 00:17:26.406 "flush": false, 00:17:26.406 "reset": true, 00:17:26.406 "nvme_admin": false, 00:17:26.406 "nvme_io": false, 00:17:26.406 "nvme_io_md": false, 00:17:26.406 "write_zeroes": true, 00:17:26.406 "zcopy": false, 00:17:26.406 "get_zone_info": false, 00:17:26.406 "zone_management": false, 00:17:26.406 "zone_append": false, 00:17:26.406 "compare": false, 00:17:26.406 "compare_and_write": false, 00:17:26.406 "abort": false, 00:17:26.406 "seek_hole": false, 00:17:26.406 "seek_data": false, 00:17:26.406 "copy": false, 00:17:26.406 "nvme_iov_md": false 00:17:26.406 }, 00:17:26.406 "memory_domains": [ 00:17:26.406 { 00:17:26.406 "dma_device_id": "system", 00:17:26.406 
"dma_device_type": 1 00:17:26.406 }, 00:17:26.406 { 00:17:26.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.406 "dma_device_type": 2 00:17:26.406 }, 00:17:26.406 { 00:17:26.406 "dma_device_id": "system", 00:17:26.406 "dma_device_type": 1 00:17:26.406 }, 00:17:26.406 { 00:17:26.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.406 "dma_device_type": 2 00:17:26.406 }, 00:17:26.406 { 00:17:26.406 "dma_device_id": "system", 00:17:26.406 "dma_device_type": 1 00:17:26.406 }, 00:17:26.406 { 00:17:26.406 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.406 "dma_device_type": 2 00:17:26.406 } 00:17:26.406 ], 00:17:26.406 "driver_specific": { 00:17:26.406 "raid": { 00:17:26.406 "uuid": "e920bd06-4f7f-401a-a415-61d9980f6d93", 00:17:26.406 "strip_size_kb": 0, 00:17:26.406 "state": "online", 00:17:26.406 "raid_level": "raid1", 00:17:26.406 "superblock": true, 00:17:26.406 "num_base_bdevs": 3, 00:17:26.406 "num_base_bdevs_discovered": 3, 00:17:26.406 "num_base_bdevs_operational": 3, 00:17:26.406 "base_bdevs_list": [ 00:17:26.406 { 00:17:26.406 "name": "BaseBdev1", 00:17:26.406 "uuid": "7343cee0-14a7-4bf4-b22c-5c39f0595acf", 00:17:26.406 "is_configured": true, 00:17:26.406 "data_offset": 2048, 00:17:26.406 "data_size": 63488 00:17:26.406 }, 00:17:26.406 { 00:17:26.406 "name": "BaseBdev2", 00:17:26.406 "uuid": "c621a32c-ee9c-4b77-99d5-15384ef192ba", 00:17:26.406 "is_configured": true, 00:17:26.406 "data_offset": 2048, 00:17:26.406 "data_size": 63488 00:17:26.406 }, 00:17:26.406 { 00:17:26.406 "name": "BaseBdev3", 00:17:26.406 "uuid": "07e02611-3ce2-488e-a767-8decf75e829b", 00:17:26.406 "is_configured": true, 00:17:26.406 "data_offset": 2048, 00:17:26.406 "data_size": 63488 00:17:26.406 } 00:17:26.406 ] 00:17:26.406 } 00:17:26.406 } 00:17:26.406 }' 00:17:26.406 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:26.406 10:25:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:26.406 BaseBdev2 00:17:26.406 BaseBdev3' 00:17:26.406 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:26.406 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:26.406 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:26.664 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:26.664 "name": "BaseBdev1", 00:17:26.664 "aliases": [ 00:17:26.664 "7343cee0-14a7-4bf4-b22c-5c39f0595acf" 00:17:26.664 ], 00:17:26.664 "product_name": "Malloc disk", 00:17:26.664 "block_size": 512, 00:17:26.664 "num_blocks": 65536, 00:17:26.664 "uuid": "7343cee0-14a7-4bf4-b22c-5c39f0595acf", 00:17:26.664 "assigned_rate_limits": { 00:17:26.664 "rw_ios_per_sec": 0, 00:17:26.664 "rw_mbytes_per_sec": 0, 00:17:26.664 "r_mbytes_per_sec": 0, 00:17:26.664 "w_mbytes_per_sec": 0 00:17:26.664 }, 00:17:26.664 "claimed": true, 00:17:26.664 "claim_type": "exclusive_write", 00:17:26.664 "zoned": false, 00:17:26.664 "supported_io_types": { 00:17:26.664 "read": true, 00:17:26.664 "write": true, 00:17:26.664 "unmap": true, 00:17:26.664 "flush": true, 00:17:26.664 "reset": true, 00:17:26.664 "nvme_admin": false, 00:17:26.664 "nvme_io": false, 00:17:26.664 "nvme_io_md": false, 00:17:26.664 "write_zeroes": true, 00:17:26.664 "zcopy": true, 00:17:26.664 "get_zone_info": false, 00:17:26.664 "zone_management": false, 00:17:26.664 "zone_append": false, 00:17:26.664 "compare": false, 00:17:26.664 "compare_and_write": false, 00:17:26.664 "abort": true, 00:17:26.664 "seek_hole": false, 00:17:26.664 "seek_data": false, 00:17:26.664 "copy": true, 00:17:26.664 "nvme_iov_md": false 00:17:26.664 }, 00:17:26.664 "memory_domains": 
[ 00:17:26.664 { 00:17:26.664 "dma_device_id": "system", 00:17:26.664 "dma_device_type": 1 00:17:26.664 }, 00:17:26.664 { 00:17:26.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.664 "dma_device_type": 2 00:17:26.664 } 00:17:26.664 ], 00:17:26.664 "driver_specific": {} 00:17:26.664 }' 00:17:26.664 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.664 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.664 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:26.664 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.665 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.923 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:26.923 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.923 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.923 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:26.923 10:25:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.923 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.923 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:26.923 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:26.923 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:26.923 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:17:27.181 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:27.181 "name": "BaseBdev2", 00:17:27.181 "aliases": [ 00:17:27.181 "c621a32c-ee9c-4b77-99d5-15384ef192ba" 00:17:27.181 ], 00:17:27.181 "product_name": "Malloc disk", 00:17:27.181 "block_size": 512, 00:17:27.181 "num_blocks": 65536, 00:17:27.181 "uuid": "c621a32c-ee9c-4b77-99d5-15384ef192ba", 00:17:27.181 "assigned_rate_limits": { 00:17:27.181 "rw_ios_per_sec": 0, 00:17:27.181 "rw_mbytes_per_sec": 0, 00:17:27.181 "r_mbytes_per_sec": 0, 00:17:27.181 "w_mbytes_per_sec": 0 00:17:27.181 }, 00:17:27.181 "claimed": true, 00:17:27.181 "claim_type": "exclusive_write", 00:17:27.181 "zoned": false, 00:17:27.181 "supported_io_types": { 00:17:27.181 "read": true, 00:17:27.181 "write": true, 00:17:27.181 "unmap": true, 00:17:27.181 "flush": true, 00:17:27.181 "reset": true, 00:17:27.181 "nvme_admin": false, 00:17:27.181 "nvme_io": false, 00:17:27.181 "nvme_io_md": false, 00:17:27.181 "write_zeroes": true, 00:17:27.181 "zcopy": true, 00:17:27.181 "get_zone_info": false, 00:17:27.181 "zone_management": false, 00:17:27.181 "zone_append": false, 00:17:27.181 "compare": false, 00:17:27.181 "compare_and_write": false, 00:17:27.181 "abort": true, 00:17:27.181 "seek_hole": false, 00:17:27.181 "seek_data": false, 00:17:27.181 "copy": true, 00:17:27.181 "nvme_iov_md": false 00:17:27.181 }, 00:17:27.181 "memory_domains": [ 00:17:27.181 { 00:17:27.181 "dma_device_id": "system", 00:17:27.181 "dma_device_type": 1 00:17:27.181 }, 00:17:27.181 { 00:17:27.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.181 "dma_device_type": 2 00:17:27.181 } 00:17:27.181 ], 00:17:27.181 "driver_specific": {} 00:17:27.181 }' 00:17:27.181 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.181 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.181 10:25:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:27.440 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:27.698 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:27.698 "name": "BaseBdev3", 00:17:27.698 "aliases": [ 00:17:27.698 "07e02611-3ce2-488e-a767-8decf75e829b" 00:17:27.698 ], 00:17:27.698 "product_name": "Malloc disk", 00:17:27.698 "block_size": 512, 00:17:27.698 "num_blocks": 65536, 00:17:27.698 "uuid": "07e02611-3ce2-488e-a767-8decf75e829b", 00:17:27.698 "assigned_rate_limits": { 00:17:27.698 "rw_ios_per_sec": 0, 00:17:27.698 "rw_mbytes_per_sec": 0, 00:17:27.698 "r_mbytes_per_sec": 0, 00:17:27.698 
"w_mbytes_per_sec": 0 00:17:27.698 }, 00:17:27.698 "claimed": true, 00:17:27.698 "claim_type": "exclusive_write", 00:17:27.698 "zoned": false, 00:17:27.698 "supported_io_types": { 00:17:27.698 "read": true, 00:17:27.698 "write": true, 00:17:27.698 "unmap": true, 00:17:27.698 "flush": true, 00:17:27.698 "reset": true, 00:17:27.698 "nvme_admin": false, 00:17:27.698 "nvme_io": false, 00:17:27.698 "nvme_io_md": false, 00:17:27.698 "write_zeroes": true, 00:17:27.698 "zcopy": true, 00:17:27.698 "get_zone_info": false, 00:17:27.698 "zone_management": false, 00:17:27.698 "zone_append": false, 00:17:27.698 "compare": false, 00:17:27.698 "compare_and_write": false, 00:17:27.698 "abort": true, 00:17:27.698 "seek_hole": false, 00:17:27.698 "seek_data": false, 00:17:27.698 "copy": true, 00:17:27.698 "nvme_iov_md": false 00:17:27.698 }, 00:17:27.698 "memory_domains": [ 00:17:27.698 { 00:17:27.698 "dma_device_id": "system", 00:17:27.698 "dma_device_type": 1 00:17:27.698 }, 00:17:27.698 { 00:17:27.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.698 "dma_device_type": 2 00:17:27.698 } 00:17:27.698 ], 00:17:27.698 "driver_specific": {} 00:17:27.698 }' 00:17:27.698 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.957 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:27.957 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:27.957 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.957 10:25:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:27.957 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:27.957 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:27.957 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:17:27.957 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:27.957 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.216 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:28.216 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:28.216 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:28.475 [2024-07-15 10:25:05.436009] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.475 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.733 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.733 "name": "Existed_Raid", 00:17:28.733 "uuid": "e920bd06-4f7f-401a-a415-61d9980f6d93", 00:17:28.733 "strip_size_kb": 0, 00:17:28.733 "state": "online", 00:17:28.733 "raid_level": "raid1", 00:17:28.733 "superblock": true, 00:17:28.733 "num_base_bdevs": 3, 00:17:28.733 "num_base_bdevs_discovered": 2, 00:17:28.733 "num_base_bdevs_operational": 2, 00:17:28.733 "base_bdevs_list": [ 00:17:28.733 { 00:17:28.733 "name": null, 00:17:28.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.734 "is_configured": false, 00:17:28.734 "data_offset": 2048, 00:17:28.734 "data_size": 63488 00:17:28.734 }, 00:17:28.734 { 00:17:28.734 "name": "BaseBdev2", 00:17:28.734 "uuid": "c621a32c-ee9c-4b77-99d5-15384ef192ba", 00:17:28.734 "is_configured": true, 00:17:28.734 "data_offset": 2048, 00:17:28.734 "data_size": 63488 00:17:28.734 }, 00:17:28.734 { 00:17:28.734 "name": "BaseBdev3", 00:17:28.734 "uuid": "07e02611-3ce2-488e-a767-8decf75e829b", 00:17:28.734 "is_configured": true, 00:17:28.734 "data_offset": 2048, 00:17:28.734 "data_size": 63488 00:17:28.734 } 
00:17:28.734 ] 00:17:28.734 }' 00:17:28.734 10:25:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.734 10:25:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:29.298 10:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:29.298 10:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:29.298 10:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.298 10:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:29.555 10:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:29.555 10:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:29.555 10:25:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:30.120 [2024-07-15 10:25:07.025322] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:30.120 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:30.120 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:30.120 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.120 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:30.120 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:30.120 10:25:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:30.120 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:30.378 [2024-07-15 10:25:07.541465] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:30.378 [2024-07-15 10:25:07.541559] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:30.378 [2024-07-15 10:25:07.554261] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:30.378 [2024-07-15 10:25:07.554299] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:30.378 [2024-07-15 10:25:07.554312] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x196f400 name Existed_Raid, state offline 00:17:30.378 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:30.378 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:30.637 10:25:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:31.246 BaseBdev2 00:17:31.246 10:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:31.246 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:31.246 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:31.246 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:31.246 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:31.246 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:31.246 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:31.504 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:31.762 [ 00:17:31.762 { 00:17:31.762 "name": "BaseBdev2", 00:17:31.762 "aliases": [ 00:17:31.762 "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157" 00:17:31.762 ], 00:17:31.762 "product_name": "Malloc disk", 00:17:31.762 "block_size": 512, 00:17:31.762 "num_blocks": 65536, 00:17:31.762 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:31.762 "assigned_rate_limits": { 00:17:31.762 "rw_ios_per_sec": 0, 00:17:31.762 "rw_mbytes_per_sec": 0, 00:17:31.762 "r_mbytes_per_sec": 0, 00:17:31.762 "w_mbytes_per_sec": 0 00:17:31.762 }, 00:17:31.762 "claimed": false, 00:17:31.762 "zoned": false, 
00:17:31.762 "supported_io_types": { 00:17:31.762 "read": true, 00:17:31.762 "write": true, 00:17:31.762 "unmap": true, 00:17:31.762 "flush": true, 00:17:31.762 "reset": true, 00:17:31.762 "nvme_admin": false, 00:17:31.762 "nvme_io": false, 00:17:31.762 "nvme_io_md": false, 00:17:31.762 "write_zeroes": true, 00:17:31.762 "zcopy": true, 00:17:31.762 "get_zone_info": false, 00:17:31.762 "zone_management": false, 00:17:31.762 "zone_append": false, 00:17:31.762 "compare": false, 00:17:31.762 "compare_and_write": false, 00:17:31.762 "abort": true, 00:17:31.762 "seek_hole": false, 00:17:31.762 "seek_data": false, 00:17:31.762 "copy": true, 00:17:31.762 "nvme_iov_md": false 00:17:31.762 }, 00:17:31.762 "memory_domains": [ 00:17:31.762 { 00:17:31.762 "dma_device_id": "system", 00:17:31.762 "dma_device_type": 1 00:17:31.762 }, 00:17:31.762 { 00:17:31.762 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.762 "dma_device_type": 2 00:17:31.762 } 00:17:31.762 ], 00:17:31.762 "driver_specific": {} 00:17:31.762 } 00:17:31.762 ] 00:17:31.762 10:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:31.762 10:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:31.762 10:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:31.762 10:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:32.327 BaseBdev3 00:17:32.327 10:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:32.327 10:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:32.327 10:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:32.327 10:25:09 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:32.327 10:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:32.327 10:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:32.327 10:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:32.584 10:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:33.149 [ 00:17:33.149 { 00:17:33.149 "name": "BaseBdev3", 00:17:33.149 "aliases": [ 00:17:33.149 "d3860442-ae51-4507-bd9b-b4657e3e56c5" 00:17:33.149 ], 00:17:33.149 "product_name": "Malloc disk", 00:17:33.149 "block_size": 512, 00:17:33.149 "num_blocks": 65536, 00:17:33.149 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:33.149 "assigned_rate_limits": { 00:17:33.149 "rw_ios_per_sec": 0, 00:17:33.149 "rw_mbytes_per_sec": 0, 00:17:33.149 "r_mbytes_per_sec": 0, 00:17:33.149 "w_mbytes_per_sec": 0 00:17:33.149 }, 00:17:33.149 "claimed": false, 00:17:33.149 "zoned": false, 00:17:33.149 "supported_io_types": { 00:17:33.149 "read": true, 00:17:33.149 "write": true, 00:17:33.149 "unmap": true, 00:17:33.149 "flush": true, 00:17:33.149 "reset": true, 00:17:33.149 "nvme_admin": false, 00:17:33.149 "nvme_io": false, 00:17:33.149 "nvme_io_md": false, 00:17:33.149 "write_zeroes": true, 00:17:33.149 "zcopy": true, 00:17:33.149 "get_zone_info": false, 00:17:33.149 "zone_management": false, 00:17:33.149 "zone_append": false, 00:17:33.149 "compare": false, 00:17:33.149 "compare_and_write": false, 00:17:33.149 "abort": true, 00:17:33.149 "seek_hole": false, 00:17:33.149 "seek_data": false, 00:17:33.149 "copy": true, 00:17:33.149 "nvme_iov_md": 
false 00:17:33.149 }, 00:17:33.149 "memory_domains": [ 00:17:33.149 { 00:17:33.149 "dma_device_id": "system", 00:17:33.149 "dma_device_type": 1 00:17:33.149 }, 00:17:33.149 { 00:17:33.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.149 "dma_device_type": 2 00:17:33.149 } 00:17:33.149 ], 00:17:33.149 "driver_specific": {} 00:17:33.149 } 00:17:33.149 ] 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:33.149 [2024-07-15 10:25:10.323352] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:33.149 [2024-07-15 10:25:10.323393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:33.149 [2024-07-15 10:25:10.323413] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:33.149 [2024-07-15 10:25:10.324739] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:33.149 10:25:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.149 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.407 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.407 "name": "Existed_Raid", 00:17:33.407 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:33.407 "strip_size_kb": 0, 00:17:33.407 "state": "configuring", 00:17:33.407 "raid_level": "raid1", 00:17:33.407 "superblock": true, 00:17:33.407 "num_base_bdevs": 3, 00:17:33.407 "num_base_bdevs_discovered": 2, 00:17:33.407 "num_base_bdevs_operational": 3, 00:17:33.407 "base_bdevs_list": [ 00:17:33.407 { 00:17:33.407 "name": "BaseBdev1", 00:17:33.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.407 "is_configured": false, 00:17:33.407 "data_offset": 0, 00:17:33.407 "data_size": 0 00:17:33.407 }, 00:17:33.407 { 00:17:33.407 "name": "BaseBdev2", 00:17:33.407 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:33.407 "is_configured": true, 00:17:33.407 "data_offset": 2048, 00:17:33.407 "data_size": 63488 00:17:33.407 }, 00:17:33.407 { 00:17:33.407 "name": "BaseBdev3", 
00:17:33.407 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:33.407 "is_configured": true, 00:17:33.407 "data_offset": 2048, 00:17:33.407 "data_size": 63488 00:17:33.407 } 00:17:33.407 ] 00:17:33.407 }' 00:17:33.407 10:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.407 10:25:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:34.012 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:34.269 [2024-07-15 10:25:11.410194] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:34.269 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:34.269 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:34.270 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.527 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.527 "name": "Existed_Raid", 00:17:34.527 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:34.527 "strip_size_kb": 0, 00:17:34.527 "state": "configuring", 00:17:34.527 "raid_level": "raid1", 00:17:34.527 "superblock": true, 00:17:34.527 "num_base_bdevs": 3, 00:17:34.527 "num_base_bdevs_discovered": 1, 00:17:34.527 "num_base_bdevs_operational": 3, 00:17:34.527 "base_bdevs_list": [ 00:17:34.527 { 00:17:34.527 "name": "BaseBdev1", 00:17:34.527 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:34.527 "is_configured": false, 00:17:34.527 "data_offset": 0, 00:17:34.527 "data_size": 0 00:17:34.527 }, 00:17:34.527 { 00:17:34.527 "name": null, 00:17:34.527 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:34.527 "is_configured": false, 00:17:34.527 "data_offset": 2048, 00:17:34.527 "data_size": 63488 00:17:34.527 }, 00:17:34.527 { 00:17:34.527 "name": "BaseBdev3", 00:17:34.527 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:34.527 "is_configured": true, 00:17:34.527 "data_offset": 2048, 00:17:34.527 "data_size": 63488 00:17:34.527 } 00:17:34.527 ] 00:17:34.527 }' 00:17:34.527 10:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.527 10:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:35.460 10:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.460 10:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:17:35.719 10:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:35.719 10:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:35.977 [2024-07-15 10:25:13.043111] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:35.977 BaseBdev1 00:17:35.977 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:35.977 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:35.977 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:35.977 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:35.977 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:35.977 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:35.977 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:36.245 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:36.504 [ 00:17:36.504 { 00:17:36.504 "name": "BaseBdev1", 00:17:36.504 "aliases": [ 00:17:36.504 "f7a84150-b83c-456a-8c47-c758e5e3ecfa" 00:17:36.504 ], 00:17:36.504 "product_name": "Malloc disk", 00:17:36.504 "block_size": 512, 00:17:36.504 "num_blocks": 65536, 00:17:36.504 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:36.504 
"assigned_rate_limits": { 00:17:36.504 "rw_ios_per_sec": 0, 00:17:36.504 "rw_mbytes_per_sec": 0, 00:17:36.504 "r_mbytes_per_sec": 0, 00:17:36.504 "w_mbytes_per_sec": 0 00:17:36.504 }, 00:17:36.504 "claimed": true, 00:17:36.504 "claim_type": "exclusive_write", 00:17:36.504 "zoned": false, 00:17:36.504 "supported_io_types": { 00:17:36.504 "read": true, 00:17:36.504 "write": true, 00:17:36.504 "unmap": true, 00:17:36.504 "flush": true, 00:17:36.504 "reset": true, 00:17:36.504 "nvme_admin": false, 00:17:36.504 "nvme_io": false, 00:17:36.504 "nvme_io_md": false, 00:17:36.504 "write_zeroes": true, 00:17:36.504 "zcopy": true, 00:17:36.504 "get_zone_info": false, 00:17:36.504 "zone_management": false, 00:17:36.504 "zone_append": false, 00:17:36.504 "compare": false, 00:17:36.504 "compare_and_write": false, 00:17:36.504 "abort": true, 00:17:36.504 "seek_hole": false, 00:17:36.504 "seek_data": false, 00:17:36.504 "copy": true, 00:17:36.504 "nvme_iov_md": false 00:17:36.504 }, 00:17:36.504 "memory_domains": [ 00:17:36.504 { 00:17:36.504 "dma_device_id": "system", 00:17:36.504 "dma_device_type": 1 00:17:36.504 }, 00:17:36.504 { 00:17:36.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.504 "dma_device_type": 2 00:17:36.504 } 00:17:36.504 ], 00:17:36.504 "driver_specific": {} 00:17:36.504 } 00:17:36.504 ] 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:36.504 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:36.762 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:36.762 "name": "Existed_Raid", 00:17:36.762 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:36.762 "strip_size_kb": 0, 00:17:36.762 "state": "configuring", 00:17:36.762 "raid_level": "raid1", 00:17:36.762 "superblock": true, 00:17:36.762 "num_base_bdevs": 3, 00:17:36.762 "num_base_bdevs_discovered": 2, 00:17:36.762 "num_base_bdevs_operational": 3, 00:17:36.762 "base_bdevs_list": [ 00:17:36.762 { 00:17:36.762 "name": "BaseBdev1", 00:17:36.762 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:36.762 "is_configured": true, 00:17:36.762 "data_offset": 2048, 00:17:36.762 "data_size": 63488 00:17:36.762 }, 00:17:36.762 { 00:17:36.762 "name": null, 00:17:36.762 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:36.762 "is_configured": false, 00:17:36.762 "data_offset": 2048, 00:17:36.762 "data_size": 63488 00:17:36.762 }, 00:17:36.762 { 00:17:36.762 "name": "BaseBdev3", 00:17:36.762 "uuid": 
"d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:36.762 "is_configured": true, 00:17:36.762 "data_offset": 2048, 00:17:36.762 "data_size": 63488 00:17:36.762 } 00:17:36.762 ] 00:17:36.762 }' 00:17:36.762 10:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:36.762 10:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:37.327 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.327 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:37.585 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:37.585 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:37.843 [2024-07-15 10:25:14.851950] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.843 10:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.101 10:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.102 "name": "Existed_Raid", 00:17:38.102 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:38.102 "strip_size_kb": 0, 00:17:38.102 "state": "configuring", 00:17:38.102 "raid_level": "raid1", 00:17:38.102 "superblock": true, 00:17:38.102 "num_base_bdevs": 3, 00:17:38.102 "num_base_bdevs_discovered": 1, 00:17:38.102 "num_base_bdevs_operational": 3, 00:17:38.102 "base_bdevs_list": [ 00:17:38.102 { 00:17:38.102 "name": "BaseBdev1", 00:17:38.102 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:38.102 "is_configured": true, 00:17:38.102 "data_offset": 2048, 00:17:38.102 "data_size": 63488 00:17:38.102 }, 00:17:38.102 { 00:17:38.102 "name": null, 00:17:38.102 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:38.102 "is_configured": false, 00:17:38.102 "data_offset": 2048, 00:17:38.102 "data_size": 63488 00:17:38.102 }, 00:17:38.102 { 00:17:38.102 "name": null, 00:17:38.102 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:38.102 "is_configured": false, 00:17:38.102 "data_offset": 2048, 00:17:38.102 "data_size": 63488 00:17:38.102 } 00:17:38.102 ] 00:17:38.102 }' 00:17:38.102 10:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:17:38.102 10:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:38.668 10:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.668 10:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:38.926 10:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:38.926 10:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:39.185 [2024-07-15 10:25:16.179496] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.185 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:39.443 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:39.443 "name": "Existed_Raid", 00:17:39.443 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:39.443 "strip_size_kb": 0, 00:17:39.443 "state": "configuring", 00:17:39.443 "raid_level": "raid1", 00:17:39.443 "superblock": true, 00:17:39.443 "num_base_bdevs": 3, 00:17:39.443 "num_base_bdevs_discovered": 2, 00:17:39.443 "num_base_bdevs_operational": 3, 00:17:39.443 "base_bdevs_list": [ 00:17:39.443 { 00:17:39.443 "name": "BaseBdev1", 00:17:39.443 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:39.443 "is_configured": true, 00:17:39.444 "data_offset": 2048, 00:17:39.444 "data_size": 63488 00:17:39.444 }, 00:17:39.444 { 00:17:39.444 "name": null, 00:17:39.444 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:39.444 "is_configured": false, 00:17:39.444 "data_offset": 2048, 00:17:39.444 "data_size": 63488 00:17:39.444 }, 00:17:39.444 { 00:17:39.444 "name": "BaseBdev3", 00:17:39.444 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:39.444 "is_configured": true, 00:17:39.444 "data_offset": 2048, 00:17:39.444 "data_size": 63488 00:17:39.444 } 00:17:39.444 ] 00:17:39.444 }' 00:17:39.444 10:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:39.444 10:25:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:40.010 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.010 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:40.268 [2024-07-15 10:25:17.394727] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.268 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.269 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.527 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.527 "name": "Existed_Raid", 00:17:40.527 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:40.527 "strip_size_kb": 0, 00:17:40.527 "state": "configuring", 00:17:40.527 "raid_level": "raid1", 00:17:40.527 "superblock": true, 00:17:40.527 "num_base_bdevs": 3, 00:17:40.527 "num_base_bdevs_discovered": 1, 00:17:40.527 "num_base_bdevs_operational": 3, 00:17:40.527 "base_bdevs_list": [ 00:17:40.527 { 00:17:40.527 "name": null, 00:17:40.527 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:40.527 "is_configured": false, 00:17:40.527 "data_offset": 2048, 00:17:40.527 "data_size": 63488 00:17:40.527 }, 00:17:40.527 { 00:17:40.527 "name": null, 00:17:40.527 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:40.527 "is_configured": false, 00:17:40.527 "data_offset": 2048, 00:17:40.527 "data_size": 63488 00:17:40.527 }, 00:17:40.527 { 00:17:40.527 "name": "BaseBdev3", 00:17:40.527 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:40.527 "is_configured": true, 00:17:40.527 "data_offset": 2048, 00:17:40.527 "data_size": 63488 00:17:40.527 } 00:17:40.527 ] 00:17:40.527 }' 00:17:40.527 10:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.527 10:25:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:41.092 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.092 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:17:41.350 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:41.350 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:41.608 [2024-07-15 10:25:18.588481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.608 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:17:41.866 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.866 "name": "Existed_Raid", 00:17:41.866 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:41.866 "strip_size_kb": 0, 00:17:41.866 "state": "configuring", 00:17:41.866 "raid_level": "raid1", 00:17:41.866 "superblock": true, 00:17:41.866 "num_base_bdevs": 3, 00:17:41.866 "num_base_bdevs_discovered": 2, 00:17:41.866 "num_base_bdevs_operational": 3, 00:17:41.866 "base_bdevs_list": [ 00:17:41.866 { 00:17:41.866 "name": null, 00:17:41.866 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:41.866 "is_configured": false, 00:17:41.866 "data_offset": 2048, 00:17:41.866 "data_size": 63488 00:17:41.866 }, 00:17:41.866 { 00:17:41.866 "name": "BaseBdev2", 00:17:41.866 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:41.866 "is_configured": true, 00:17:41.866 "data_offset": 2048, 00:17:41.866 "data_size": 63488 00:17:41.866 }, 00:17:41.866 { 00:17:41.866 "name": "BaseBdev3", 00:17:41.866 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:41.866 "is_configured": true, 00:17:41.866 "data_offset": 2048, 00:17:41.866 "data_size": 63488 00:17:41.866 } 00:17:41.866 ] 00:17:41.866 }' 00:17:41.866 10:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.866 10:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:42.432 10:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.432 10:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:42.690 10:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:42.690 10:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.690 10:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:42.948 10:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f7a84150-b83c-456a-8c47-c758e5e3ecfa 00:17:43.207 [2024-07-15 10:25:20.189266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:43.207 [2024-07-15 10:25:20.189422] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x19651b0 00:17:43.207 [2024-07-15 10:25:20.189435] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:43.207 [2024-07-15 10:25:20.189613] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b214f0 00:17:43.207 [2024-07-15 10:25:20.189736] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x19651b0 00:17:43.207 [2024-07-15 10:25:20.189747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x19651b0 00:17:43.207 [2024-07-15 10:25:20.189840] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:43.207 NewBaseBdev 00:17:43.207 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:43.207 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:43.207 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:43.207 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:43.207 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:43.207 
10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:43.207 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.465 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:43.724 [ 00:17:43.724 { 00:17:43.724 "name": "NewBaseBdev", 00:17:43.724 "aliases": [ 00:17:43.724 "f7a84150-b83c-456a-8c47-c758e5e3ecfa" 00:17:43.724 ], 00:17:43.724 "product_name": "Malloc disk", 00:17:43.724 "block_size": 512, 00:17:43.724 "num_blocks": 65536, 00:17:43.724 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:43.724 "assigned_rate_limits": { 00:17:43.724 "rw_ios_per_sec": 0, 00:17:43.724 "rw_mbytes_per_sec": 0, 00:17:43.724 "r_mbytes_per_sec": 0, 00:17:43.724 "w_mbytes_per_sec": 0 00:17:43.724 }, 00:17:43.724 "claimed": true, 00:17:43.724 "claim_type": "exclusive_write", 00:17:43.724 "zoned": false, 00:17:43.724 "supported_io_types": { 00:17:43.724 "read": true, 00:17:43.724 "write": true, 00:17:43.724 "unmap": true, 00:17:43.724 "flush": true, 00:17:43.724 "reset": true, 00:17:43.724 "nvme_admin": false, 00:17:43.724 "nvme_io": false, 00:17:43.724 "nvme_io_md": false, 00:17:43.724 "write_zeroes": true, 00:17:43.724 "zcopy": true, 00:17:43.724 "get_zone_info": false, 00:17:43.724 "zone_management": false, 00:17:43.724 "zone_append": false, 00:17:43.724 "compare": false, 00:17:43.724 "compare_and_write": false, 00:17:43.724 "abort": true, 00:17:43.724 "seek_hole": false, 00:17:43.724 "seek_data": false, 00:17:43.724 "copy": true, 00:17:43.724 "nvme_iov_md": false 00:17:43.724 }, 00:17:43.724 "memory_domains": [ 00:17:43.724 { 00:17:43.724 "dma_device_id": "system", 00:17:43.724 "dma_device_type": 1 00:17:43.724 
}, 00:17:43.724 { 00:17:43.724 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:43.724 "dma_device_type": 2 00:17:43.724 } 00:17:43.724 ], 00:17:43.724 "driver_specific": {} 00:17:43.724 } 00:17:43.724 ] 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:43.724 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.725 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.725 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.725 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.725 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.725 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:43.983 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.983 "name": "Existed_Raid", 00:17:43.983 "uuid": 
"ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:43.983 "strip_size_kb": 0, 00:17:43.983 "state": "online", 00:17:43.983 "raid_level": "raid1", 00:17:43.983 "superblock": true, 00:17:43.983 "num_base_bdevs": 3, 00:17:43.983 "num_base_bdevs_discovered": 3, 00:17:43.983 "num_base_bdevs_operational": 3, 00:17:43.983 "base_bdevs_list": [ 00:17:43.983 { 00:17:43.983 "name": "NewBaseBdev", 00:17:43.983 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:43.983 "is_configured": true, 00:17:43.983 "data_offset": 2048, 00:17:43.983 "data_size": 63488 00:17:43.983 }, 00:17:43.983 { 00:17:43.983 "name": "BaseBdev2", 00:17:43.983 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:43.983 "is_configured": true, 00:17:43.983 "data_offset": 2048, 00:17:43.983 "data_size": 63488 00:17:43.983 }, 00:17:43.983 { 00:17:43.983 "name": "BaseBdev3", 00:17:43.983 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:43.983 "is_configured": true, 00:17:43.983 "data_offset": 2048, 00:17:43.983 "data_size": 63488 00:17:43.983 } 00:17:43.983 ] 00:17:43.983 }' 00:17:43.983 10:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.983 10:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:44.549 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:44.549 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:44.549 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:44.549 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:44.549 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:44.549 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:44.549 10:25:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:44.549 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:44.807 [2024-07-15 10:25:21.749695] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:44.807 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:44.807 "name": "Existed_Raid", 00:17:44.807 "aliases": [ 00:17:44.807 "ac08fc2c-7bef-4839-8a1f-b0c539090384" 00:17:44.807 ], 00:17:44.807 "product_name": "Raid Volume", 00:17:44.807 "block_size": 512, 00:17:44.807 "num_blocks": 63488, 00:17:44.807 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:44.807 "assigned_rate_limits": { 00:17:44.807 "rw_ios_per_sec": 0, 00:17:44.807 "rw_mbytes_per_sec": 0, 00:17:44.807 "r_mbytes_per_sec": 0, 00:17:44.807 "w_mbytes_per_sec": 0 00:17:44.807 }, 00:17:44.807 "claimed": false, 00:17:44.807 "zoned": false, 00:17:44.807 "supported_io_types": { 00:17:44.807 "read": true, 00:17:44.807 "write": true, 00:17:44.807 "unmap": false, 00:17:44.807 "flush": false, 00:17:44.807 "reset": true, 00:17:44.807 "nvme_admin": false, 00:17:44.807 "nvme_io": false, 00:17:44.807 "nvme_io_md": false, 00:17:44.807 "write_zeroes": true, 00:17:44.807 "zcopy": false, 00:17:44.807 "get_zone_info": false, 00:17:44.807 "zone_management": false, 00:17:44.807 "zone_append": false, 00:17:44.807 "compare": false, 00:17:44.807 "compare_and_write": false, 00:17:44.807 "abort": false, 00:17:44.807 "seek_hole": false, 00:17:44.807 "seek_data": false, 00:17:44.807 "copy": false, 00:17:44.807 "nvme_iov_md": false 00:17:44.807 }, 00:17:44.807 "memory_domains": [ 00:17:44.807 { 00:17:44.807 "dma_device_id": "system", 00:17:44.807 "dma_device_type": 1 00:17:44.807 }, 00:17:44.807 { 00:17:44.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.807 
"dma_device_type": 2 00:17:44.807 }, 00:17:44.807 { 00:17:44.807 "dma_device_id": "system", 00:17:44.807 "dma_device_type": 1 00:17:44.807 }, 00:17:44.807 { 00:17:44.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.807 "dma_device_type": 2 00:17:44.807 }, 00:17:44.807 { 00:17:44.807 "dma_device_id": "system", 00:17:44.807 "dma_device_type": 1 00:17:44.807 }, 00:17:44.807 { 00:17:44.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.807 "dma_device_type": 2 00:17:44.807 } 00:17:44.807 ], 00:17:44.807 "driver_specific": { 00:17:44.807 "raid": { 00:17:44.807 "uuid": "ac08fc2c-7bef-4839-8a1f-b0c539090384", 00:17:44.807 "strip_size_kb": 0, 00:17:44.807 "state": "online", 00:17:44.807 "raid_level": "raid1", 00:17:44.807 "superblock": true, 00:17:44.807 "num_base_bdevs": 3, 00:17:44.807 "num_base_bdevs_discovered": 3, 00:17:44.807 "num_base_bdevs_operational": 3, 00:17:44.807 "base_bdevs_list": [ 00:17:44.807 { 00:17:44.807 "name": "NewBaseBdev", 00:17:44.807 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:44.807 "is_configured": true, 00:17:44.807 "data_offset": 2048, 00:17:44.807 "data_size": 63488 00:17:44.807 }, 00:17:44.807 { 00:17:44.807 "name": "BaseBdev2", 00:17:44.807 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:44.807 "is_configured": true, 00:17:44.807 "data_offset": 2048, 00:17:44.807 "data_size": 63488 00:17:44.807 }, 00:17:44.807 { 00:17:44.807 "name": "BaseBdev3", 00:17:44.807 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:44.807 "is_configured": true, 00:17:44.807 "data_offset": 2048, 00:17:44.807 "data_size": 63488 00:17:44.807 } 00:17:44.807 ] 00:17:44.807 } 00:17:44.807 } 00:17:44.807 }' 00:17:44.807 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:44.807 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:44.807 BaseBdev2 00:17:44.807 
BaseBdev3' 00:17:44.807 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:44.807 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:44.807 10:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.066 "name": "NewBaseBdev", 00:17:45.066 "aliases": [ 00:17:45.066 "f7a84150-b83c-456a-8c47-c758e5e3ecfa" 00:17:45.066 ], 00:17:45.066 "product_name": "Malloc disk", 00:17:45.066 "block_size": 512, 00:17:45.066 "num_blocks": 65536, 00:17:45.066 "uuid": "f7a84150-b83c-456a-8c47-c758e5e3ecfa", 00:17:45.066 "assigned_rate_limits": { 00:17:45.066 "rw_ios_per_sec": 0, 00:17:45.066 "rw_mbytes_per_sec": 0, 00:17:45.066 "r_mbytes_per_sec": 0, 00:17:45.066 "w_mbytes_per_sec": 0 00:17:45.066 }, 00:17:45.066 "claimed": true, 00:17:45.066 "claim_type": "exclusive_write", 00:17:45.066 "zoned": false, 00:17:45.066 "supported_io_types": { 00:17:45.066 "read": true, 00:17:45.066 "write": true, 00:17:45.066 "unmap": true, 00:17:45.066 "flush": true, 00:17:45.066 "reset": true, 00:17:45.066 "nvme_admin": false, 00:17:45.066 "nvme_io": false, 00:17:45.066 "nvme_io_md": false, 00:17:45.066 "write_zeroes": true, 00:17:45.066 "zcopy": true, 00:17:45.066 "get_zone_info": false, 00:17:45.066 "zone_management": false, 00:17:45.066 "zone_append": false, 00:17:45.066 "compare": false, 00:17:45.066 "compare_and_write": false, 00:17:45.066 "abort": true, 00:17:45.066 "seek_hole": false, 00:17:45.066 "seek_data": false, 00:17:45.066 "copy": true, 00:17:45.066 "nvme_iov_md": false 00:17:45.066 }, 00:17:45.066 "memory_domains": [ 00:17:45.066 { 00:17:45.066 "dma_device_id": "system", 00:17:45.066 "dma_device_type": 1 00:17:45.066 }, 00:17:45.066 { 
00:17:45.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.066 "dma_device_type": 2 00:17:45.066 } 00:17:45.066 ], 00:17:45.066 "driver_specific": {} 00:17:45.066 }' 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.066 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:45.324 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.583 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.583 "name": 
"BaseBdev2", 00:17:45.583 "aliases": [ 00:17:45.583 "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157" 00:17:45.583 ], 00:17:45.583 "product_name": "Malloc disk", 00:17:45.583 "block_size": 512, 00:17:45.583 "num_blocks": 65536, 00:17:45.583 "uuid": "8e8cfbdf-ec7c-4f5f-ac57-b34615bb3157", 00:17:45.583 "assigned_rate_limits": { 00:17:45.583 "rw_ios_per_sec": 0, 00:17:45.583 "rw_mbytes_per_sec": 0, 00:17:45.583 "r_mbytes_per_sec": 0, 00:17:45.583 "w_mbytes_per_sec": 0 00:17:45.583 }, 00:17:45.583 "claimed": true, 00:17:45.583 "claim_type": "exclusive_write", 00:17:45.583 "zoned": false, 00:17:45.583 "supported_io_types": { 00:17:45.583 "read": true, 00:17:45.583 "write": true, 00:17:45.583 "unmap": true, 00:17:45.583 "flush": true, 00:17:45.583 "reset": true, 00:17:45.583 "nvme_admin": false, 00:17:45.583 "nvme_io": false, 00:17:45.583 "nvme_io_md": false, 00:17:45.583 "write_zeroes": true, 00:17:45.583 "zcopy": true, 00:17:45.583 "get_zone_info": false, 00:17:45.583 "zone_management": false, 00:17:45.583 "zone_append": false, 00:17:45.583 "compare": false, 00:17:45.583 "compare_and_write": false, 00:17:45.583 "abort": true, 00:17:45.583 "seek_hole": false, 00:17:45.583 "seek_data": false, 00:17:45.583 "copy": true, 00:17:45.583 "nvme_iov_md": false 00:17:45.583 }, 00:17:45.583 "memory_domains": [ 00:17:45.583 { 00:17:45.583 "dma_device_id": "system", 00:17:45.583 "dma_device_type": 1 00:17:45.583 }, 00:17:45.583 { 00:17:45.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.583 "dma_device_type": 2 00:17:45.583 } 00:17:45.583 ], 00:17:45.583 "driver_specific": {} 00:17:45.583 }' 00:17:45.583 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.583 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.583 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.583 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:17:45.841 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.841 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.841 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.841 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.841 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.841 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.841 10:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.841 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.841 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.841 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:45.841 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.098 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.098 "name": "BaseBdev3", 00:17:46.098 "aliases": [ 00:17:46.098 "d3860442-ae51-4507-bd9b-b4657e3e56c5" 00:17:46.098 ], 00:17:46.098 "product_name": "Malloc disk", 00:17:46.098 "block_size": 512, 00:17:46.098 "num_blocks": 65536, 00:17:46.098 "uuid": "d3860442-ae51-4507-bd9b-b4657e3e56c5", 00:17:46.098 "assigned_rate_limits": { 00:17:46.098 "rw_ios_per_sec": 0, 00:17:46.098 "rw_mbytes_per_sec": 0, 00:17:46.098 "r_mbytes_per_sec": 0, 00:17:46.098 "w_mbytes_per_sec": 0 00:17:46.098 }, 00:17:46.098 "claimed": true, 00:17:46.098 "claim_type": "exclusive_write", 00:17:46.098 "zoned": 
false, 00:17:46.098 "supported_io_types": { 00:17:46.098 "read": true, 00:17:46.098 "write": true, 00:17:46.098 "unmap": true, 00:17:46.098 "flush": true, 00:17:46.098 "reset": true, 00:17:46.098 "nvme_admin": false, 00:17:46.098 "nvme_io": false, 00:17:46.098 "nvme_io_md": false, 00:17:46.098 "write_zeroes": true, 00:17:46.098 "zcopy": true, 00:17:46.099 "get_zone_info": false, 00:17:46.099 "zone_management": false, 00:17:46.099 "zone_append": false, 00:17:46.099 "compare": false, 00:17:46.099 "compare_and_write": false, 00:17:46.099 "abort": true, 00:17:46.099 "seek_hole": false, 00:17:46.099 "seek_data": false, 00:17:46.099 "copy": true, 00:17:46.099 "nvme_iov_md": false 00:17:46.099 }, 00:17:46.099 "memory_domains": [ 00:17:46.099 { 00:17:46.099 "dma_device_id": "system", 00:17:46.099 "dma_device_type": 1 00:17:46.099 }, 00:17:46.099 { 00:17:46.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.099 "dma_device_type": 2 00:17:46.099 } 00:17:46.099 ], 00:17:46.099 "driver_specific": {} 00:17:46.099 }' 00:17:46.099 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.357 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.357 10:25:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.616 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.616 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.616 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:46.875 [2024-07-15 10:25:23.830958] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:46.875 [2024-07-15 10:25:23.830985] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:46.875 [2024-07-15 10:25:23.831036] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:46.875 [2024-07-15 10:25:23.831311] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:46.875 [2024-07-15 10:25:23.831324] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x19651b0 name Existed_Raid, state offline 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 525469 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 525469 ']' 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 525469 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 525469 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 525469' 00:17:46.875 killing process with pid 525469 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 525469 00:17:46.875 [2024-07-15 10:25:23.903353] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:46.875 10:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 525469 00:17:46.875 [2024-07-15 10:25:23.930460] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:47.134 10:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:47.134 00:17:47.134 real 0m29.187s 00:17:47.134 user 0m53.534s 00:17:47.134 sys 0m5.185s 00:17:47.134 10:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:47.134 10:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:47.134 ************************************ 00:17:47.134 END TEST raid_state_function_test_sb 00:17:47.134 ************************************ 00:17:47.134 10:25:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:47.134 10:25:24 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:17:47.134 10:25:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:47.134 10:25:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:47.134 10:25:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:47.134 ************************************ 00:17:47.134 START TEST raid_superblock_test 00:17:47.134 ************************************ 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=529921 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 529921 /var/tmp/spdk-raid.sock 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 529921 ']' 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:47.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:47.134 10:25:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.134 [2024-07-15 10:25:24.309671] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:17:47.134 [2024-07-15 10:25:24.309743] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid529921 ] 00:17:47.449 [2024-07-15 10:25:24.440343] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:47.449 [2024-07-15 10:25:24.542952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:47.449 [2024-07-15 10:25:24.605533] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:47.449 [2024-07-15 10:25:24.605561] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:48.426 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:48.427 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:48.427 malloc1 00:17:48.427 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:48.684 [2024-07-15 10:25:25.654036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:48.684 [2024-07-15 10:25:25.654085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.684 [2024-07-15 10:25:25.654105] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfec570 00:17:48.684 [2024-07-15 10:25:25.654118] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.684 [2024-07-15 10:25:25.655700] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.684 [2024-07-15 10:25:25.655730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:48.684 pt1 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:48.684 10:25:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:48.684 malloc2 00:17:48.684 10:25:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:48.941 [2024-07-15 10:25:26.087921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:48.941 [2024-07-15 10:25:26.087973] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.941 [2024-07-15 10:25:26.087990] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfed970 00:17:48.941 [2024-07-15 10:25:26.088002] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.941 [2024-07-15 10:25:26.089463] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.941 [2024-07-15 10:25:26.089492] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:48.941 pt2 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:48.941 10:25:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:48.941 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:49.198 malloc3 00:17:49.198 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:49.456 [2024-07-15 10:25:26.589975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:49.456 [2024-07-15 10:25:26.590028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:49.456 [2024-07-15 10:25:26.590045] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1184340 00:17:49.456 [2024-07-15 10:25:26.590057] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:49.456 [2024-07-15 10:25:26.591542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:49.456 [2024-07-15 10:25:26.591571] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:49.456 pt3 00:17:49.456 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:49.456 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:49.456 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:49.713 [2024-07-15 10:25:26.766444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:49.713 [2024-07-15 10:25:26.767665] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:17:49.713 [2024-07-15 10:25:26.767719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:49.713 [2024-07-15 10:25:26.767864] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfe4ea0 00:17:49.713 [2024-07-15 10:25:26.767875] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:49.713 [2024-07-15 10:25:26.768074] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfec240 00:17:49.713 [2024-07-15 10:25:26.768233] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfe4ea0 00:17:49.713 [2024-07-15 10:25:26.768244] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfe4ea0 00:17:49.713 [2024-07-15 10:25:26.768338] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.713 10:25:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.713 10:25:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:49.969 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:49.969 "name": "raid_bdev1", 00:17:49.969 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:49.969 "strip_size_kb": 0, 00:17:49.969 "state": "online", 00:17:49.969 "raid_level": "raid1", 00:17:49.969 "superblock": true, 00:17:49.969 "num_base_bdevs": 3, 00:17:49.969 "num_base_bdevs_discovered": 3, 00:17:49.969 "num_base_bdevs_operational": 3, 00:17:49.969 "base_bdevs_list": [ 00:17:49.969 { 00:17:49.969 "name": "pt1", 00:17:49.969 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:49.969 "is_configured": true, 00:17:49.969 "data_offset": 2048, 00:17:49.969 "data_size": 63488 00:17:49.969 }, 00:17:49.969 { 00:17:49.969 "name": "pt2", 00:17:49.969 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:49.970 "is_configured": true, 00:17:49.970 "data_offset": 2048, 00:17:49.970 "data_size": 63488 00:17:49.970 }, 00:17:49.970 { 00:17:49.970 "name": "pt3", 00:17:49.970 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:49.970 "is_configured": true, 00:17:49.970 "data_offset": 2048, 00:17:49.970 "data_size": 63488 00:17:49.970 } 00:17:49.970 ] 00:17:49.970 }' 00:17:49.970 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:49.970 10:25:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:50.533 10:25:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:51.098 [2024-07-15 10:25:28.106288] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:51.098 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:51.098 "name": "raid_bdev1", 00:17:51.098 "aliases": [ 00:17:51.098 "2eb95585-1065-4088-8aaa-6be7901e002c" 00:17:51.098 ], 00:17:51.098 "product_name": "Raid Volume", 00:17:51.098 "block_size": 512, 00:17:51.098 "num_blocks": 63488, 00:17:51.098 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:51.098 "assigned_rate_limits": { 00:17:51.098 "rw_ios_per_sec": 0, 00:17:51.098 "rw_mbytes_per_sec": 0, 00:17:51.098 "r_mbytes_per_sec": 0, 00:17:51.098 "w_mbytes_per_sec": 0 00:17:51.098 }, 00:17:51.098 "claimed": false, 00:17:51.098 "zoned": false, 00:17:51.098 "supported_io_types": { 00:17:51.098 "read": true, 00:17:51.098 "write": true, 00:17:51.098 "unmap": false, 00:17:51.098 "flush": false, 00:17:51.098 "reset": true, 00:17:51.098 "nvme_admin": false, 00:17:51.098 "nvme_io": false, 00:17:51.098 "nvme_io_md": false, 00:17:51.098 "write_zeroes": true, 00:17:51.098 "zcopy": false, 00:17:51.098 "get_zone_info": false, 00:17:51.098 "zone_management": false, 00:17:51.098 "zone_append": false, 00:17:51.098 "compare": false, 00:17:51.098 "compare_and_write": false, 00:17:51.098 "abort": false, 00:17:51.098 "seek_hole": false, 
00:17:51.098 "seek_data": false, 00:17:51.098 "copy": false, 00:17:51.098 "nvme_iov_md": false 00:17:51.098 }, 00:17:51.098 "memory_domains": [ 00:17:51.098 { 00:17:51.098 "dma_device_id": "system", 00:17:51.098 "dma_device_type": 1 00:17:51.098 }, 00:17:51.098 { 00:17:51.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.098 "dma_device_type": 2 00:17:51.098 }, 00:17:51.098 { 00:17:51.098 "dma_device_id": "system", 00:17:51.098 "dma_device_type": 1 00:17:51.098 }, 00:17:51.098 { 00:17:51.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.098 "dma_device_type": 2 00:17:51.098 }, 00:17:51.098 { 00:17:51.098 "dma_device_id": "system", 00:17:51.098 "dma_device_type": 1 00:17:51.098 }, 00:17:51.098 { 00:17:51.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.098 "dma_device_type": 2 00:17:51.098 } 00:17:51.098 ], 00:17:51.098 "driver_specific": { 00:17:51.098 "raid": { 00:17:51.098 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:51.098 "strip_size_kb": 0, 00:17:51.098 "state": "online", 00:17:51.098 "raid_level": "raid1", 00:17:51.098 "superblock": true, 00:17:51.098 "num_base_bdevs": 3, 00:17:51.098 "num_base_bdevs_discovered": 3, 00:17:51.098 "num_base_bdevs_operational": 3, 00:17:51.098 "base_bdevs_list": [ 00:17:51.098 { 00:17:51.098 "name": "pt1", 00:17:51.098 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:51.098 "is_configured": true, 00:17:51.098 "data_offset": 2048, 00:17:51.098 "data_size": 63488 00:17:51.098 }, 00:17:51.098 { 00:17:51.098 "name": "pt2", 00:17:51.098 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:51.098 "is_configured": true, 00:17:51.098 "data_offset": 2048, 00:17:51.098 "data_size": 63488 00:17:51.098 }, 00:17:51.098 { 00:17:51.098 "name": "pt3", 00:17:51.098 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:51.098 "is_configured": true, 00:17:51.098 "data_offset": 2048, 00:17:51.098 "data_size": 63488 00:17:51.098 } 00:17:51.098 ] 00:17:51.098 } 00:17:51.098 } 00:17:51.098 }' 00:17:51.098 10:25:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:51.098 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:51.098 pt2 00:17:51.098 pt3' 00:17:51.098 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:51.098 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:51.098 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:51.356 "name": "pt1", 00:17:51.356 "aliases": [ 00:17:51.356 "00000000-0000-0000-0000-000000000001" 00:17:51.356 ], 00:17:51.356 "product_name": "passthru", 00:17:51.356 "block_size": 512, 00:17:51.356 "num_blocks": 65536, 00:17:51.356 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:51.356 "assigned_rate_limits": { 00:17:51.356 "rw_ios_per_sec": 0, 00:17:51.356 "rw_mbytes_per_sec": 0, 00:17:51.356 "r_mbytes_per_sec": 0, 00:17:51.356 "w_mbytes_per_sec": 0 00:17:51.356 }, 00:17:51.356 "claimed": true, 00:17:51.356 "claim_type": "exclusive_write", 00:17:51.356 "zoned": false, 00:17:51.356 "supported_io_types": { 00:17:51.356 "read": true, 00:17:51.356 "write": true, 00:17:51.356 "unmap": true, 00:17:51.356 "flush": true, 00:17:51.356 "reset": true, 00:17:51.356 "nvme_admin": false, 00:17:51.356 "nvme_io": false, 00:17:51.356 "nvme_io_md": false, 00:17:51.356 "write_zeroes": true, 00:17:51.356 "zcopy": true, 00:17:51.356 "get_zone_info": false, 00:17:51.356 "zone_management": false, 00:17:51.356 "zone_append": false, 00:17:51.356 "compare": false, 00:17:51.356 "compare_and_write": false, 00:17:51.356 "abort": true, 00:17:51.356 "seek_hole": false, 00:17:51.356 "seek_data": false, 
00:17:51.356 "copy": true, 00:17:51.356 "nvme_iov_md": false 00:17:51.356 }, 00:17:51.356 "memory_domains": [ 00:17:51.356 { 00:17:51.356 "dma_device_id": "system", 00:17:51.356 "dma_device_type": 1 00:17:51.356 }, 00:17:51.356 { 00:17:51.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.356 "dma_device_type": 2 00:17:51.356 } 00:17:51.356 ], 00:17:51.356 "driver_specific": { 00:17:51.356 "passthru": { 00:17:51.356 "name": "pt1", 00:17:51.356 "base_bdev_name": "malloc1" 00:17:51.356 } 00:17:51.356 } 00:17:51.356 }' 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:51.356 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:17:51.615 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:51.874 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:51.874 "name": "pt2", 00:17:51.874 "aliases": [ 00:17:51.874 "00000000-0000-0000-0000-000000000002" 00:17:51.874 ], 00:17:51.874 "product_name": "passthru", 00:17:51.874 "block_size": 512, 00:17:51.874 "num_blocks": 65536, 00:17:51.874 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:51.874 "assigned_rate_limits": { 00:17:51.874 "rw_ios_per_sec": 0, 00:17:51.874 "rw_mbytes_per_sec": 0, 00:17:51.874 "r_mbytes_per_sec": 0, 00:17:51.874 "w_mbytes_per_sec": 0 00:17:51.874 }, 00:17:51.874 "claimed": true, 00:17:51.874 "claim_type": "exclusive_write", 00:17:51.874 "zoned": false, 00:17:51.874 "supported_io_types": { 00:17:51.874 "read": true, 00:17:51.874 "write": true, 00:17:51.874 "unmap": true, 00:17:51.874 "flush": true, 00:17:51.874 "reset": true, 00:17:51.874 "nvme_admin": false, 00:17:51.874 "nvme_io": false, 00:17:51.874 "nvme_io_md": false, 00:17:51.874 "write_zeroes": true, 00:17:51.874 "zcopy": true, 00:17:51.874 "get_zone_info": false, 00:17:51.874 "zone_management": false, 00:17:51.874 "zone_append": false, 00:17:51.874 "compare": false, 00:17:51.874 "compare_and_write": false, 00:17:51.874 "abort": true, 00:17:51.874 "seek_hole": false, 00:17:51.874 "seek_data": false, 00:17:51.874 "copy": true, 00:17:51.874 "nvme_iov_md": false 00:17:51.874 }, 00:17:51.874 "memory_domains": [ 00:17:51.874 { 00:17:51.874 "dma_device_id": "system", 00:17:51.874 "dma_device_type": 1 00:17:51.874 }, 00:17:51.874 { 00:17:51.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:51.874 "dma_device_type": 2 00:17:51.874 } 00:17:51.874 ], 00:17:51.874 "driver_specific": { 00:17:51.874 "passthru": { 00:17:51.874 "name": "pt2", 00:17:51.874 "base_bdev_name": "malloc2" 00:17:51.874 } 00:17:51.875 } 00:17:51.875 }' 00:17:51.875 10:25:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.875 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:51.875 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:51.875 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.875 10:25:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:51.875 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:51.875 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:51.875 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:52.133 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:52.133 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:52.133 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:52.133 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:52.133 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:52.133 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:52.133 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:52.700 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:52.700 "name": "pt3", 00:17:52.700 "aliases": [ 00:17:52.700 "00000000-0000-0000-0000-000000000003" 00:17:52.700 ], 00:17:52.700 "product_name": "passthru", 00:17:52.700 "block_size": 512, 00:17:52.700 "num_blocks": 65536, 00:17:52.700 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:52.700 "assigned_rate_limits": { 
00:17:52.700 "rw_ios_per_sec": 0, 00:17:52.700 "rw_mbytes_per_sec": 0, 00:17:52.700 "r_mbytes_per_sec": 0, 00:17:52.700 "w_mbytes_per_sec": 0 00:17:52.700 }, 00:17:52.700 "claimed": true, 00:17:52.700 "claim_type": "exclusive_write", 00:17:52.700 "zoned": false, 00:17:52.700 "supported_io_types": { 00:17:52.700 "read": true, 00:17:52.700 "write": true, 00:17:52.700 "unmap": true, 00:17:52.700 "flush": true, 00:17:52.700 "reset": true, 00:17:52.700 "nvme_admin": false, 00:17:52.700 "nvme_io": false, 00:17:52.700 "nvme_io_md": false, 00:17:52.700 "write_zeroes": true, 00:17:52.700 "zcopy": true, 00:17:52.700 "get_zone_info": false, 00:17:52.700 "zone_management": false, 00:17:52.700 "zone_append": false, 00:17:52.700 "compare": false, 00:17:52.700 "compare_and_write": false, 00:17:52.700 "abort": true, 00:17:52.700 "seek_hole": false, 00:17:52.700 "seek_data": false, 00:17:52.700 "copy": true, 00:17:52.700 "nvme_iov_md": false 00:17:52.700 }, 00:17:52.701 "memory_domains": [ 00:17:52.701 { 00:17:52.701 "dma_device_id": "system", 00:17:52.701 "dma_device_type": 1 00:17:52.701 }, 00:17:52.701 { 00:17:52.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.701 "dma_device_type": 2 00:17:52.701 } 00:17:52.701 ], 00:17:52.701 "driver_specific": { 00:17:52.701 "passthru": { 00:17:52.701 "name": "pt3", 00:17:52.701 "base_bdev_name": "malloc3" 00:17:52.701 } 00:17:52.701 } 00:17:52.701 }' 00:17:52.701 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:52.701 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:52.701 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:52.701 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:52.701 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:52.701 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:52.701 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:52.960 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:52.960 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:52.960 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:52.960 10:25:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:52.960 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:52.960 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:52.960 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:53.219 [2024-07-15 10:25:30.191782] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:53.219 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2eb95585-1065-4088-8aaa-6be7901e002c 00:17:53.219 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2eb95585-1065-4088-8aaa-6be7901e002c ']' 00:17:53.219 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:53.478 [2024-07-15 10:25:30.436161] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:53.478 [2024-07-15 10:25:30.436187] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:53.478 [2024-07-15 10:25:30.436246] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:53.478 [2024-07-15 10:25:30.436319] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:17:53.478 [2024-07-15 10:25:30.436333] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfe4ea0 name raid_bdev1, state offline 00:17:53.478 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.478 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:53.737 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:53.737 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:53.737 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:53.737 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:53.995 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:53.995 10:25:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:54.262 10:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:54.262 10:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:54.521 10:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:54.521 10:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:54.778 10:25:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:55.036 [2024-07-15 10:25:32.168674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:55.036 [2024-07-15 10:25:32.170024] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:55.036 [2024-07-15 10:25:32.170068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:55.036 [2024-07-15 10:25:32.170113] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:55.036 [2024-07-15 10:25:32.170153] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:55.036 [2024-07-15 10:25:32.170176] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:55.036 [2024-07-15 10:25:32.170194] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:55.036 [2024-07-15 10:25:32.170204] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x118fff0 name raid_bdev1, state configuring 00:17:55.036 request: 00:17:55.036 { 00:17:55.036 "name": "raid_bdev1", 00:17:55.036 "raid_level": "raid1", 00:17:55.036 "base_bdevs": [ 00:17:55.036 "malloc1", 00:17:55.036 "malloc2", 00:17:55.036 "malloc3" 00:17:55.036 ], 00:17:55.036 "superblock": false, 00:17:55.036 "method": "bdev_raid_create", 00:17:55.036 "req_id": 1 00:17:55.036 } 00:17:55.036 Got JSON-RPC error response 00:17:55.036 response: 00:17:55.036 { 00:17:55.036 "code": -17, 00:17:55.036 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:55.036 } 00:17:55.036 10:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:55.037 10:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:55.037 10:25:32 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:55.037 10:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:55.037 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.037 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:55.295 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:55.295 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:55.295 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:55.553 [2024-07-15 10:25:32.649893] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:55.553 [2024-07-15 10:25:32.649955] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:55.553 [2024-07-15 10:25:32.649979] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfec7a0 00:17:55.553 [2024-07-15 10:25:32.649992] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:55.553 [2024-07-15 10:25:32.651696] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:55.553 [2024-07-15 10:25:32.651727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:55.553 [2024-07-15 10:25:32.651808] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:55.553 [2024-07-15 10:25:32.651837] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:55.553 pt1 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.553 10:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:56.119 10:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.119 "name": "raid_bdev1", 00:17:56.119 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:56.119 "strip_size_kb": 0, 00:17:56.119 "state": "configuring", 00:17:56.119 "raid_level": "raid1", 00:17:56.119 "superblock": true, 00:17:56.119 "num_base_bdevs": 3, 00:17:56.119 "num_base_bdevs_discovered": 1, 00:17:56.119 "num_base_bdevs_operational": 3, 00:17:56.119 "base_bdevs_list": [ 00:17:56.119 { 00:17:56.119 "name": "pt1", 00:17:56.119 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:56.119 "is_configured": true, 00:17:56.119 "data_offset": 2048, 00:17:56.119 
"data_size": 63488 00:17:56.119 }, 00:17:56.119 { 00:17:56.119 "name": null, 00:17:56.119 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:56.119 "is_configured": false, 00:17:56.119 "data_offset": 2048, 00:17:56.119 "data_size": 63488 00:17:56.119 }, 00:17:56.119 { 00:17:56.119 "name": null, 00:17:56.119 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:56.119 "is_configured": false, 00:17:56.119 "data_offset": 2048, 00:17:56.119 "data_size": 63488 00:17:56.119 } 00:17:56.119 ] 00:17:56.119 }' 00:17:56.119 10:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.119 10:25:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.684 10:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:56.684 10:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:57.251 [2024-07-15 10:25:34.262254] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:57.251 [2024-07-15 10:25:34.262312] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:57.251 [2024-07-15 10:25:34.262332] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe3a10 00:17:57.251 [2024-07-15 10:25:34.262344] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:57.251 [2024-07-15 10:25:34.262709] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:57.251 [2024-07-15 10:25:34.262731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:57.251 [2024-07-15 10:25:34.262803] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:57.251 [2024-07-15 10:25:34.262826] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:17:57.251 pt2 00:17:57.251 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:57.509 [2024-07-15 10:25:34.518985] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.509 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:57.767 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.767 "name": "raid_bdev1", 00:17:57.767 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:57.767 "strip_size_kb": 
0, 00:17:57.767 "state": "configuring", 00:17:57.767 "raid_level": "raid1", 00:17:57.767 "superblock": true, 00:17:57.767 "num_base_bdevs": 3, 00:17:57.767 "num_base_bdevs_discovered": 1, 00:17:57.767 "num_base_bdevs_operational": 3, 00:17:57.767 "base_bdevs_list": [ 00:17:57.767 { 00:17:57.767 "name": "pt1", 00:17:57.767 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:57.767 "is_configured": true, 00:17:57.767 "data_offset": 2048, 00:17:57.767 "data_size": 63488 00:17:57.767 }, 00:17:57.767 { 00:17:57.767 "name": null, 00:17:57.767 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:57.767 "is_configured": false, 00:17:57.767 "data_offset": 2048, 00:17:57.767 "data_size": 63488 00:17:57.767 }, 00:17:57.767 { 00:17:57.767 "name": null, 00:17:57.767 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:57.767 "is_configured": false, 00:17:57.767 "data_offset": 2048, 00:17:57.767 "data_size": 63488 00:17:57.767 } 00:17:57.767 ] 00:17:57.767 }' 00:17:57.767 10:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.767 10:25:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.332 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:58.332 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:58.332 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:58.589 [2024-07-15 10:25:35.561722] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:58.589 [2024-07-15 10:25:35.561773] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.589 [2024-07-15 10:25:35.561795] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfeca10 00:17:58.589 
[2024-07-15 10:25:35.561808] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.589 [2024-07-15 10:25:35.562158] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.589 [2024-07-15 10:25:35.562178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:58.589 [2024-07-15 10:25:35.562244] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:58.589 [2024-07-15 10:25:35.562263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:58.589 pt2 00:17:58.589 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:58.589 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:58.589 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:58.847 [2024-07-15 10:25:35.802358] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:58.847 [2024-07-15 10:25:35.802400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.847 [2024-07-15 10:25:35.802416] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe36c0 00:17:58.847 [2024-07-15 10:25:35.802429] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.847 [2024-07-15 10:25:35.802719] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.847 [2024-07-15 10:25:35.802737] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:58.847 [2024-07-15 10:25:35.802791] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:58.847 [2024-07-15 10:25:35.802809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:17:58.847 [2024-07-15 10:25:35.802915] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1186c00 00:17:58.847 [2024-07-15 10:25:35.802935] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:58.847 [2024-07-15 10:25:35.803100] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfe6610 00:17:58.847 [2024-07-15 10:25:35.803231] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1186c00 00:17:58.847 [2024-07-15 10:25:35.803241] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1186c00 00:17:58.847 [2024-07-15 10:25:35.803340] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:58.847 pt3 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.847 10:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:59.105 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.105 "name": "raid_bdev1", 00:17:59.105 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:59.105 "strip_size_kb": 0, 00:17:59.105 "state": "online", 00:17:59.105 "raid_level": "raid1", 00:17:59.105 "superblock": true, 00:17:59.105 "num_base_bdevs": 3, 00:17:59.105 "num_base_bdevs_discovered": 3, 00:17:59.105 "num_base_bdevs_operational": 3, 00:17:59.105 "base_bdevs_list": [ 00:17:59.105 { 00:17:59.105 "name": "pt1", 00:17:59.105 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:59.105 "is_configured": true, 00:17:59.105 "data_offset": 2048, 00:17:59.105 "data_size": 63488 00:17:59.105 }, 00:17:59.105 { 00:17:59.105 "name": "pt2", 00:17:59.105 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:59.105 "is_configured": true, 00:17:59.105 "data_offset": 2048, 00:17:59.105 "data_size": 63488 00:17:59.105 }, 00:17:59.105 { 00:17:59.105 "name": "pt3", 00:17:59.105 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:59.105 "is_configured": true, 00:17:59.105 "data_offset": 2048, 00:17:59.105 "data_size": 63488 00:17:59.105 } 00:17:59.105 ] 00:17:59.105 }' 00:17:59.105 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.105 10:25:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:59.672 [2024-07-15 10:25:36.737130] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:59.672 "name": "raid_bdev1", 00:17:59.672 "aliases": [ 00:17:59.672 "2eb95585-1065-4088-8aaa-6be7901e002c" 00:17:59.672 ], 00:17:59.672 "product_name": "Raid Volume", 00:17:59.672 "block_size": 512, 00:17:59.672 "num_blocks": 63488, 00:17:59.672 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:59.672 "assigned_rate_limits": { 00:17:59.672 "rw_ios_per_sec": 0, 00:17:59.672 "rw_mbytes_per_sec": 0, 00:17:59.672 "r_mbytes_per_sec": 0, 00:17:59.672 "w_mbytes_per_sec": 0 00:17:59.672 }, 00:17:59.672 "claimed": false, 00:17:59.672 "zoned": false, 00:17:59.672 "supported_io_types": { 00:17:59.672 "read": true, 00:17:59.672 "write": true, 00:17:59.672 "unmap": false, 00:17:59.672 "flush": false, 00:17:59.672 "reset": true, 00:17:59.672 "nvme_admin": false, 00:17:59.672 "nvme_io": false, 00:17:59.672 "nvme_io_md": false, 00:17:59.672 "write_zeroes": true, 00:17:59.672 "zcopy": false, 00:17:59.672 "get_zone_info": false, 00:17:59.672 "zone_management": false, 00:17:59.672 "zone_append": false, 
00:17:59.672 "compare": false, 00:17:59.672 "compare_and_write": false, 00:17:59.672 "abort": false, 00:17:59.672 "seek_hole": false, 00:17:59.672 "seek_data": false, 00:17:59.672 "copy": false, 00:17:59.672 "nvme_iov_md": false 00:17:59.672 }, 00:17:59.672 "memory_domains": [ 00:17:59.672 { 00:17:59.672 "dma_device_id": "system", 00:17:59.672 "dma_device_type": 1 00:17:59.672 }, 00:17:59.672 { 00:17:59.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.672 "dma_device_type": 2 00:17:59.672 }, 00:17:59.672 { 00:17:59.672 "dma_device_id": "system", 00:17:59.672 "dma_device_type": 1 00:17:59.672 }, 00:17:59.672 { 00:17:59.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.672 "dma_device_type": 2 00:17:59.672 }, 00:17:59.672 { 00:17:59.672 "dma_device_id": "system", 00:17:59.672 "dma_device_type": 1 00:17:59.672 }, 00:17:59.672 { 00:17:59.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.672 "dma_device_type": 2 00:17:59.672 } 00:17:59.672 ], 00:17:59.672 "driver_specific": { 00:17:59.672 "raid": { 00:17:59.672 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:17:59.672 "strip_size_kb": 0, 00:17:59.672 "state": "online", 00:17:59.672 "raid_level": "raid1", 00:17:59.672 "superblock": true, 00:17:59.672 "num_base_bdevs": 3, 00:17:59.672 "num_base_bdevs_discovered": 3, 00:17:59.672 "num_base_bdevs_operational": 3, 00:17:59.672 "base_bdevs_list": [ 00:17:59.672 { 00:17:59.672 "name": "pt1", 00:17:59.672 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:59.672 "is_configured": true, 00:17:59.672 "data_offset": 2048, 00:17:59.672 "data_size": 63488 00:17:59.672 }, 00:17:59.672 { 00:17:59.672 "name": "pt2", 00:17:59.672 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:59.672 "is_configured": true, 00:17:59.672 "data_offset": 2048, 00:17:59.672 "data_size": 63488 00:17:59.672 }, 00:17:59.672 { 00:17:59.672 "name": "pt3", 00:17:59.672 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:59.672 "is_configured": true, 00:17:59.672 "data_offset": 2048, 
00:17:59.672 "data_size": 63488 00:17:59.672 } 00:17:59.672 ] 00:17:59.672 } 00:17:59.672 } 00:17:59.672 }' 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:59.672 pt2 00:17:59.672 pt3' 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:59.672 10:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.930 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.930 "name": "pt1", 00:17:59.930 "aliases": [ 00:17:59.930 "00000000-0000-0000-0000-000000000001" 00:17:59.930 ], 00:17:59.930 "product_name": "passthru", 00:17:59.930 "block_size": 512, 00:17:59.930 "num_blocks": 65536, 00:17:59.930 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:59.930 "assigned_rate_limits": { 00:17:59.930 "rw_ios_per_sec": 0, 00:17:59.930 "rw_mbytes_per_sec": 0, 00:17:59.930 "r_mbytes_per_sec": 0, 00:17:59.930 "w_mbytes_per_sec": 0 00:17:59.930 }, 00:17:59.930 "claimed": true, 00:17:59.930 "claim_type": "exclusive_write", 00:17:59.930 "zoned": false, 00:17:59.930 "supported_io_types": { 00:17:59.930 "read": true, 00:17:59.930 "write": true, 00:17:59.930 "unmap": true, 00:17:59.930 "flush": true, 00:17:59.930 "reset": true, 00:17:59.930 "nvme_admin": false, 00:17:59.930 "nvme_io": false, 00:17:59.930 "nvme_io_md": false, 00:17:59.930 "write_zeroes": true, 00:17:59.930 "zcopy": true, 00:17:59.930 "get_zone_info": false, 00:17:59.930 "zone_management": false, 00:17:59.930 "zone_append": false, 00:17:59.930 "compare": false, 
00:17:59.930 "compare_and_write": false, 00:17:59.930 "abort": true, 00:17:59.931 "seek_hole": false, 00:17:59.931 "seek_data": false, 00:17:59.931 "copy": true, 00:17:59.931 "nvme_iov_md": false 00:17:59.931 }, 00:17:59.931 "memory_domains": [ 00:17:59.931 { 00:17:59.931 "dma_device_id": "system", 00:17:59.931 "dma_device_type": 1 00:17:59.931 }, 00:17:59.931 { 00:17:59.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.931 "dma_device_type": 2 00:17:59.931 } 00:17:59.931 ], 00:17:59.931 "driver_specific": { 00:17:59.931 "passthru": { 00:17:59.931 "name": "pt1", 00:17:59.931 "base_bdev_name": "malloc1" 00:17:59.931 } 00:17:59.931 } 00:17:59.931 }' 00:17:59.931 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.931 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.188 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.444 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.444 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:00.444 10:25:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:00.444 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:00.702 "name": "pt2", 00:18:00.702 "aliases": [ 00:18:00.702 "00000000-0000-0000-0000-000000000002" 00:18:00.702 ], 00:18:00.702 "product_name": "passthru", 00:18:00.702 "block_size": 512, 00:18:00.702 "num_blocks": 65536, 00:18:00.702 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:00.702 "assigned_rate_limits": { 00:18:00.702 "rw_ios_per_sec": 0, 00:18:00.702 "rw_mbytes_per_sec": 0, 00:18:00.702 "r_mbytes_per_sec": 0, 00:18:00.702 "w_mbytes_per_sec": 0 00:18:00.702 }, 00:18:00.702 "claimed": true, 00:18:00.702 "claim_type": "exclusive_write", 00:18:00.702 "zoned": false, 00:18:00.702 "supported_io_types": { 00:18:00.702 "read": true, 00:18:00.702 "write": true, 00:18:00.702 "unmap": true, 00:18:00.702 "flush": true, 00:18:00.702 "reset": true, 00:18:00.702 "nvme_admin": false, 00:18:00.702 "nvme_io": false, 00:18:00.702 "nvme_io_md": false, 00:18:00.702 "write_zeroes": true, 00:18:00.702 "zcopy": true, 00:18:00.702 "get_zone_info": false, 00:18:00.702 "zone_management": false, 00:18:00.702 "zone_append": false, 00:18:00.702 "compare": false, 00:18:00.702 "compare_and_write": false, 00:18:00.702 "abort": true, 00:18:00.702 "seek_hole": false, 00:18:00.702 "seek_data": false, 00:18:00.702 "copy": true, 00:18:00.702 "nvme_iov_md": false 00:18:00.702 }, 00:18:00.702 "memory_domains": [ 00:18:00.702 { 00:18:00.702 "dma_device_id": "system", 00:18:00.702 "dma_device_type": 1 00:18:00.702 }, 00:18:00.702 { 00:18:00.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.702 "dma_device_type": 2 00:18:00.702 } 00:18:00.702 ], 00:18:00.702 "driver_specific": { 00:18:00.702 "passthru": { 00:18:00.702 
"name": "pt2", 00:18:00.702 "base_bdev_name": "malloc2" 00:18:00.702 } 00:18:00.702 } 00:18:00.702 }' 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.702 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.960 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.960 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.960 10:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.960 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.960 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:00.960 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:00.960 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:01.218 "name": "pt3", 00:18:01.218 "aliases": [ 00:18:01.218 "00000000-0000-0000-0000-000000000003" 00:18:01.218 ], 00:18:01.218 "product_name": "passthru", 00:18:01.218 "block_size": 512, 00:18:01.218 
"num_blocks": 65536, 00:18:01.218 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:01.218 "assigned_rate_limits": { 00:18:01.218 "rw_ios_per_sec": 0, 00:18:01.218 "rw_mbytes_per_sec": 0, 00:18:01.218 "r_mbytes_per_sec": 0, 00:18:01.218 "w_mbytes_per_sec": 0 00:18:01.218 }, 00:18:01.218 "claimed": true, 00:18:01.218 "claim_type": "exclusive_write", 00:18:01.218 "zoned": false, 00:18:01.218 "supported_io_types": { 00:18:01.218 "read": true, 00:18:01.218 "write": true, 00:18:01.218 "unmap": true, 00:18:01.218 "flush": true, 00:18:01.218 "reset": true, 00:18:01.218 "nvme_admin": false, 00:18:01.218 "nvme_io": false, 00:18:01.218 "nvme_io_md": false, 00:18:01.218 "write_zeroes": true, 00:18:01.218 "zcopy": true, 00:18:01.218 "get_zone_info": false, 00:18:01.218 "zone_management": false, 00:18:01.218 "zone_append": false, 00:18:01.218 "compare": false, 00:18:01.218 "compare_and_write": false, 00:18:01.218 "abort": true, 00:18:01.218 "seek_hole": false, 00:18:01.218 "seek_data": false, 00:18:01.218 "copy": true, 00:18:01.218 "nvme_iov_md": false 00:18:01.218 }, 00:18:01.218 "memory_domains": [ 00:18:01.218 { 00:18:01.218 "dma_device_id": "system", 00:18:01.218 "dma_device_type": 1 00:18:01.218 }, 00:18:01.218 { 00:18:01.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.218 "dma_device_type": 2 00:18:01.218 } 00:18:01.218 ], 00:18:01.218 "driver_specific": { 00:18:01.218 "passthru": { 00:18:01.218 "name": "pt3", 00:18:01.218 "base_bdev_name": "malloc3" 00:18:01.218 } 00:18:01.218 } 00:18:01.218 }' 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:01.218 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.476 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:01.476 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:01.476 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:01.476 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:01.734 [2024-07-15 10:25:38.706352] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:01.734 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2eb95585-1065-4088-8aaa-6be7901e002c '!=' 2eb95585-1065-4088-8aaa-6be7901e002c ']' 00:18:01.734 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:18:01.734 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:01.734 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:01.734 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:01.991 [2024-07-15 10:25:38.954748] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.991 10:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:02.249 10:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:02.249 "name": "raid_bdev1", 00:18:02.249 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:18:02.249 "strip_size_kb": 0, 00:18:02.249 "state": "online", 00:18:02.249 "raid_level": "raid1", 00:18:02.249 "superblock": true, 00:18:02.249 "num_base_bdevs": 3, 00:18:02.249 "num_base_bdevs_discovered": 2, 00:18:02.249 "num_base_bdevs_operational": 2, 00:18:02.249 "base_bdevs_list": [ 00:18:02.249 { 00:18:02.249 "name": null, 00:18:02.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.249 "is_configured": false, 00:18:02.249 
"data_offset": 2048, 00:18:02.249 "data_size": 63488 00:18:02.249 }, 00:18:02.249 { 00:18:02.249 "name": "pt2", 00:18:02.249 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:02.249 "is_configured": true, 00:18:02.249 "data_offset": 2048, 00:18:02.249 "data_size": 63488 00:18:02.249 }, 00:18:02.249 { 00:18:02.249 "name": "pt3", 00:18:02.249 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:02.249 "is_configured": true, 00:18:02.249 "data_offset": 2048, 00:18:02.249 "data_size": 63488 00:18:02.249 } 00:18:02.249 ] 00:18:02.249 }' 00:18:02.249 10:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:02.249 10:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.813 10:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:03.072 [2024-07-15 10:25:40.041619] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:03.072 [2024-07-15 10:25:40.041656] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:03.072 [2024-07-15 10:25:40.041719] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:03.072 [2024-07-15 10:25:40.041777] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:03.072 [2024-07-15 10:25:40.041792] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1186c00 name raid_bdev1, state offline 00:18:03.072 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.072 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:18:03.330 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 
00:18:03.330 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:18:03.330 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:18:03.330 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:03.330 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:03.588 10:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:03.878 [2024-07-15 10:25:41.008123] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:03.878 [2024-07-15 10:25:41.008178] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:03.878 [2024-07-15 10:25:41.008196] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe4310 00:18:03.878 [2024-07-15 10:25:41.008209] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:03.878 [2024-07-15 10:25:41.009897] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:03.878 [2024-07-15 10:25:41.009937] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:03.878 [2024-07-15 10:25:41.010019] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:03.878 [2024-07-15 10:25:41.010048] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:03.878 pt2 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.878 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:18:04.136 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.136 "name": "raid_bdev1", 00:18:04.136 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:18:04.136 "strip_size_kb": 0, 00:18:04.136 "state": "configuring", 00:18:04.136 "raid_level": "raid1", 00:18:04.136 "superblock": true, 00:18:04.136 "num_base_bdevs": 3, 00:18:04.136 "num_base_bdevs_discovered": 1, 00:18:04.136 "num_base_bdevs_operational": 2, 00:18:04.136 "base_bdevs_list": [ 00:18:04.136 { 00:18:04.136 "name": null, 00:18:04.136 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.136 "is_configured": false, 00:18:04.136 "data_offset": 2048, 00:18:04.136 "data_size": 63488 00:18:04.136 }, 00:18:04.136 { 00:18:04.136 "name": "pt2", 00:18:04.136 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:04.136 "is_configured": true, 00:18:04.136 "data_offset": 2048, 00:18:04.136 "data_size": 63488 00:18:04.136 }, 00:18:04.136 { 00:18:04.136 "name": null, 00:18:04.136 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:04.136 "is_configured": false, 00:18:04.136 "data_offset": 2048, 00:18:04.136 "data_size": 63488 00:18:04.136 } 00:18:04.136 ] 00:18:04.136 }' 00:18:04.136 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.136 10:25:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.699 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:04.699 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:04.699 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:18:04.699 10:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:04.957 [2024-07-15 10:25:42.107047] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:04.957 [2024-07-15 10:25:42.107105] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:04.957 [2024-07-15 10:25:42.107128] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe2ec0 00:18:04.957 [2024-07-15 10:25:42.107141] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:04.957 [2024-07-15 10:25:42.107498] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:04.957 [2024-07-15 10:25:42.107517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:04.957 [2024-07-15 10:25:42.107586] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:04.957 [2024-07-15 10:25:42.107606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:04.957 [2024-07-15 10:25:42.107711] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1184cc0 00:18:04.957 [2024-07-15 10:25:42.107722] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:04.957 [2024-07-15 10:25:42.107886] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11856d0 00:18:04.957 [2024-07-15 10:25:42.108031] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1184cc0 00:18:04.957 [2024-07-15 10:25:42.108041] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1184cc0 00:18:04.957 [2024-07-15 10:25:42.108144] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:04.957 pt3 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.957 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:05.214 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.214 "name": "raid_bdev1", 00:18:05.214 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:18:05.214 "strip_size_kb": 0, 00:18:05.214 "state": "online", 00:18:05.214 "raid_level": "raid1", 00:18:05.214 "superblock": true, 00:18:05.214 "num_base_bdevs": 3, 00:18:05.214 "num_base_bdevs_discovered": 2, 00:18:05.214 "num_base_bdevs_operational": 2, 00:18:05.214 "base_bdevs_list": [ 00:18:05.214 { 00:18:05.214 "name": null, 00:18:05.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.214 "is_configured": false, 00:18:05.214 "data_offset": 2048, 00:18:05.214 "data_size": 63488 00:18:05.214 }, 00:18:05.214 { 00:18:05.214 "name": "pt2", 00:18:05.214 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:05.214 "is_configured": true, 00:18:05.214 
"data_offset": 2048, 00:18:05.214 "data_size": 63488 00:18:05.214 }, 00:18:05.214 { 00:18:05.214 "name": "pt3", 00:18:05.214 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:05.214 "is_configured": true, 00:18:05.214 "data_offset": 2048, 00:18:05.214 "data_size": 63488 00:18:05.214 } 00:18:05.214 ] 00:18:05.214 }' 00:18:05.214 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.214 10:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.146 10:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:06.146 [2024-07-15 10:25:43.213984] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:06.146 [2024-07-15 10:25:43.214011] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:06.146 [2024-07-15 10:25:43.214068] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:06.146 [2024-07-15 10:25:43.214122] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:06.146 [2024-07-15 10:25:43.214134] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1184cc0 name raid_bdev1, state offline 00:18:06.146 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:18:06.146 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.403 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:18:06.403 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:18:06.403 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:18:06.403 10:25:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:18:06.403 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:06.660 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:06.917 [2024-07-15 10:25:43.947887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:06.917 [2024-07-15 10:25:43.947945] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:06.917 [2024-07-15 10:25:43.947964] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe2ec0 00:18:06.917 [2024-07-15 10:25:43.947977] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:06.917 [2024-07-15 10:25:43.949609] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:06.917 [2024-07-15 10:25:43.949641] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:06.917 [2024-07-15 10:25:43.949709] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:06.917 [2024-07-15 10:25:43.949737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:06.917 [2024-07-15 10:25:43.949834] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:06.917 [2024-07-15 10:25:43.949847] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:06.917 [2024-07-15 10:25:43.949861] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1184f40 name raid_bdev1, state configuring 00:18:06.917 [2024-07-15 10:25:43.949884] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:06.917 pt1 00:18:06.917 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:18:06.917 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:06.917 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:06.917 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.917 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:06.917 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:06.918 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:06.918 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.918 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.918 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.918 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.918 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.918 10:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:07.176 10:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:07.176 "name": "raid_bdev1", 00:18:07.176 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:18:07.176 "strip_size_kb": 0, 00:18:07.176 "state": "configuring", 00:18:07.176 "raid_level": "raid1", 00:18:07.176 "superblock": true, 00:18:07.176 "num_base_bdevs": 3, 
00:18:07.176 "num_base_bdevs_discovered": 1, 00:18:07.176 "num_base_bdevs_operational": 2, 00:18:07.176 "base_bdevs_list": [ 00:18:07.176 { 00:18:07.176 "name": null, 00:18:07.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:07.176 "is_configured": false, 00:18:07.176 "data_offset": 2048, 00:18:07.176 "data_size": 63488 00:18:07.176 }, 00:18:07.176 { 00:18:07.176 "name": "pt2", 00:18:07.176 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:07.176 "is_configured": true, 00:18:07.176 "data_offset": 2048, 00:18:07.176 "data_size": 63488 00:18:07.176 }, 00:18:07.176 { 00:18:07.176 "name": null, 00:18:07.176 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:07.176 "is_configured": false, 00:18:07.176 "data_offset": 2048, 00:18:07.176 "data_size": 63488 00:18:07.176 } 00:18:07.176 ] 00:18:07.176 }' 00:18:07.176 10:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:07.176 10:25:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.742 10:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:07.742 10:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:08.000 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:18:08.000 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:08.257 [2024-07-15 10:25:45.287453] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:08.257 [2024-07-15 10:25:45.287510] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:08.257 [2024-07-15 10:25:45.287533] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe60c0 00:18:08.257 [2024-07-15 10:25:45.287546] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:08.257 [2024-07-15 10:25:45.287902] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:08.257 [2024-07-15 10:25:45.287920] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:08.257 [2024-07-15 10:25:45.288002] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:08.257 [2024-07-15 10:25:45.288023] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:08.257 [2024-07-15 10:25:45.288127] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfe6a40 00:18:08.257 [2024-07-15 10:25:45.288138] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:08.257 [2024-07-15 10:25:45.288307] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11856c0 00:18:08.257 [2024-07-15 10:25:45.288436] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfe6a40 00:18:08.257 [2024-07-15 10:25:45.288452] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfe6a40 00:18:08.257 [2024-07-15 10:25:45.288551] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:08.257 pt3 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.257 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:08.515 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.515 "name": "raid_bdev1", 00:18:08.515 "uuid": "2eb95585-1065-4088-8aaa-6be7901e002c", 00:18:08.515 "strip_size_kb": 0, 00:18:08.515 "state": "online", 00:18:08.515 "raid_level": "raid1", 00:18:08.515 "superblock": true, 00:18:08.515 "num_base_bdevs": 3, 00:18:08.515 "num_base_bdevs_discovered": 2, 00:18:08.515 "num_base_bdevs_operational": 2, 00:18:08.515 "base_bdevs_list": [ 00:18:08.515 { 00:18:08.515 "name": null, 00:18:08.515 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.515 "is_configured": false, 00:18:08.515 "data_offset": 2048, 00:18:08.515 "data_size": 63488 00:18:08.515 }, 00:18:08.515 { 00:18:08.515 "name": "pt2", 00:18:08.515 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:08.515 "is_configured": true, 00:18:08.515 "data_offset": 2048, 00:18:08.515 "data_size": 63488 00:18:08.515 }, 00:18:08.515 { 00:18:08.515 "name": "pt3", 00:18:08.515 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:08.515 "is_configured": true, 00:18:08.515 
"data_offset": 2048, 00:18:08.515 "data_size": 63488 00:18:08.515 } 00:18:08.515 ] 00:18:08.515 }' 00:18:08.515 10:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.515 10:25:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.080 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:09.080 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:09.337 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:18:09.337 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:09.337 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:18:09.594 [2024-07-15 10:25:46.603203] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2eb95585-1065-4088-8aaa-6be7901e002c '!=' 2eb95585-1065-4088-8aaa-6be7901e002c ']' 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 529921 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 529921 ']' 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 529921 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 529921 00:18:09.594 10:25:46 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 529921' 00:18:09.594 killing process with pid 529921 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 529921 00:18:09.594 [2024-07-15 10:25:46.674832] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:09.594 [2024-07-15 10:25:46.674892] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:09.594 [2024-07-15 10:25:46.674962] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:09.594 [2024-07-15 10:25:46.674975] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfe6a40 name raid_bdev1, state offline 00:18:09.594 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 529921 00:18:09.594 [2024-07-15 10:25:46.705663] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:09.852 10:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:09.852 00:18:09.852 real 0m22.692s 00:18:09.852 user 0m41.473s 00:18:09.852 sys 0m4.087s 00:18:09.852 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:09.852 10:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.852 ************************************ 00:18:09.852 END TEST raid_superblock_test 00:18:09.852 ************************************ 00:18:09.852 10:25:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:09.852 10:25:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:18:09.852 10:25:46 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:09.852 10:25:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:09.852 10:25:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:09.852 ************************************ 00:18:09.852 START TEST raid_read_error_test 00:18:09.853 ************************************ 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZR5qkmaQri 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=533354 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 533354 /var/tmp/spdk-raid.sock 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 533354 ']' 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:09.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:09.853 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:10.111 [2024-07-15 10:25:47.084280] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:18:10.111 [2024-07-15 10:25:47.084346] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid533354 ] 00:18:10.111 [2024-07-15 10:25:47.212476] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:10.111 [2024-07-15 10:25:47.308884] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:10.369 [2024-07-15 10:25:47.363757] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:10.369 [2024-07-15 10:25:47.363808] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:10.934 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:10.934 10:25:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:10.934 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:10.934 10:25:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:11.191 BaseBdev1_malloc 00:18:11.191 10:25:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:11.449 true 00:18:11.449 10:25:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:11.707 [2024-07-15 10:25:48.651787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:11.707 [2024-07-15 10:25:48.651834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:11.707 [2024-07-15 10:25:48.651854] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ea80d0 00:18:11.707 [2024-07-15 10:25:48.651866] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:11.707 [2024-07-15 10:25:48.653751] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:11.707 [2024-07-15 10:25:48.653782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:11.707 BaseBdev1 00:18:11.707 10:25:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:11.707 10:25:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:11.707 BaseBdev2_malloc 00:18:11.965 10:25:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:11.965 true 00:18:11.965 10:25:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:12.223 [2024-07-15 10:25:49.374245] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:12.223 [2024-07-15 10:25:49.374291] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:12.223 [2024-07-15 10:25:49.374311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eac910 00:18:12.223 [2024-07-15 10:25:49.374324] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:12.223 [2024-07-15 10:25:49.375886] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:12.223 [2024-07-15 10:25:49.375916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:12.223 BaseBdev2 00:18:12.223 10:25:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:12.223 10:25:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:12.482 BaseBdev3_malloc 00:18:12.482 10:25:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:12.739 true 00:18:12.739 10:25:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:12.997 [2024-07-15 10:25:50.117822] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:12.997 [2024-07-15 10:25:50.117871] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:12.997 [2024-07-15 10:25:50.117892] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1eaebd0 00:18:12.997 [2024-07-15 10:25:50.117904] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:12.997 [2024-07-15 10:25:50.119526] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:12.997 [2024-07-15 10:25:50.119556] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:12.997 BaseBdev3 00:18:12.997 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:13.255 [2024-07-15 10:25:50.354481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:13.255 [2024-07-15 10:25:50.355829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:13.255 [2024-07-15 10:25:50.355898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:13.255 [2024-07-15 10:25:50.356116] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1eb0280 00:18:13.255 [2024-07-15 10:25:50.356128] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:13.255 [2024-07-15 10:25:50.356331] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eafe20 00:18:13.255 [2024-07-15 10:25:50.356485] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1eb0280 00:18:13.255 [2024-07-15 10:25:50.356495] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1eb0280 00:18:13.255 [2024-07-15 10:25:50.356603] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:13.255 10:25:50 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.255 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:13.513 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.513 "name": "raid_bdev1", 00:18:13.513 "uuid": "d95b4bbc-6841-4b7e-9ce3-5008cebb2b53", 00:18:13.513 "strip_size_kb": 0, 00:18:13.513 "state": "online", 00:18:13.513 "raid_level": "raid1", 00:18:13.513 "superblock": true, 00:18:13.513 "num_base_bdevs": 3, 00:18:13.513 "num_base_bdevs_discovered": 3, 00:18:13.513 "num_base_bdevs_operational": 3, 00:18:13.513 "base_bdevs_list": [ 00:18:13.513 { 00:18:13.513 "name": "BaseBdev1", 00:18:13.513 "uuid": "2781d93d-4f50-5923-966d-70f7baef2503", 00:18:13.513 "is_configured": true, 00:18:13.513 "data_offset": 2048, 00:18:13.513 "data_size": 63488 00:18:13.513 }, 00:18:13.513 { 00:18:13.513 "name": "BaseBdev2", 00:18:13.513 "uuid": "1cad9f6b-8bb2-511e-a5f0-5d3b0fcc076b", 00:18:13.513 
"is_configured": true, 00:18:13.513 "data_offset": 2048, 00:18:13.513 "data_size": 63488 00:18:13.513 }, 00:18:13.513 { 00:18:13.513 "name": "BaseBdev3", 00:18:13.513 "uuid": "5c52d09e-12ce-579a-a655-f54826091ba5", 00:18:13.513 "is_configured": true, 00:18:13.513 "data_offset": 2048, 00:18:13.513 "data_size": 63488 00:18:13.513 } 00:18:13.513 ] 00:18:13.513 }' 00:18:13.513 10:25:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.513 10:25:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.079 10:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:18:14.079 10:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:14.337 [2024-07-15 10:25:51.301273] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfde00 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.272 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:15.530 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.530 "name": "raid_bdev1", 00:18:15.530 "uuid": "d95b4bbc-6841-4b7e-9ce3-5008cebb2b53", 00:18:15.530 "strip_size_kb": 0, 00:18:15.530 "state": "online", 00:18:15.530 "raid_level": "raid1", 00:18:15.530 "superblock": true, 00:18:15.530 "num_base_bdevs": 3, 00:18:15.530 "num_base_bdevs_discovered": 3, 00:18:15.530 "num_base_bdevs_operational": 3, 00:18:15.530 "base_bdevs_list": [ 00:18:15.530 { 00:18:15.530 "name": "BaseBdev1", 00:18:15.530 "uuid": "2781d93d-4f50-5923-966d-70f7baef2503", 00:18:15.530 "is_configured": true, 00:18:15.530 "data_offset": 2048, 00:18:15.530 "data_size": 63488 00:18:15.530 }, 00:18:15.530 { 00:18:15.530 "name": "BaseBdev2", 00:18:15.530 "uuid": "1cad9f6b-8bb2-511e-a5f0-5d3b0fcc076b", 00:18:15.530 "is_configured": true, 00:18:15.530 "data_offset": 2048, 00:18:15.530 
"data_size": 63488 00:18:15.530 }, 00:18:15.530 { 00:18:15.530 "name": "BaseBdev3", 00:18:15.530 "uuid": "5c52d09e-12ce-579a-a655-f54826091ba5", 00:18:15.530 "is_configured": true, 00:18:15.530 "data_offset": 2048, 00:18:15.530 "data_size": 63488 00:18:15.530 } 00:18:15.530 ] 00:18:15.530 }' 00:18:15.530 10:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.530 10:25:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:16.463 [2024-07-15 10:25:53.541287] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:16.463 [2024-07-15 10:25:53.541322] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:16.463 [2024-07-15 10:25:53.544523] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:16.463 [2024-07-15 10:25:53.544556] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:16.463 [2024-07-15 10:25:53.544654] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:16.463 [2024-07-15 10:25:53.544667] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1eb0280 name raid_bdev1, state offline 00:18:16.463 0 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 533354 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 533354 ']' 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 533354 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux 
= Linux ']' 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 533354 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 533354' 00:18:16.463 killing process with pid 533354 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 533354 00:18:16.463 [2024-07-15 10:25:53.611525] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:16.463 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 533354 00:18:16.463 [2024-07-15 10:25:53.632640] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZR5qkmaQri 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:16.721 00:18:16.721 real 0m6.854s 00:18:16.721 user 0m10.832s 00:18:16.721 sys 0m1.212s 00:18:16.721 10:25:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:16.721 10:25:53 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.721 ************************************ 00:18:16.721 END TEST raid_read_error_test 00:18:16.721 ************************************ 00:18:16.980 10:25:53 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:16.980 10:25:53 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:18:16.980 10:25:53 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:16.980 10:25:53 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:16.980 10:25:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:16.980 ************************************ 00:18:16.980 START TEST raid_write_error_test 00:18:16.980 ************************************ 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:16.980 10:25:53 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.M6ZrQTOfAZ 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=534335 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 534335 /var/tmp/spdk-raid.sock 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T 
raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 534335 ']' 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:16.980 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:16.980 10:25:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.980 [2024-07-15 10:25:54.043420] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:18:16.980 [2024-07-15 10:25:54.043492] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid534335 ] 00:18:16.980 [2024-07-15 10:25:54.172410] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:17.238 [2024-07-15 10:25:54.275758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:17.238 [2024-07-15 10:25:54.335010] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:17.238 [2024-07-15 10:25:54.335043] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:17.804 10:25:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:17.804 10:25:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:17.804 10:25:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:17.804 10:25:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:18.062 BaseBdev1_malloc 00:18:18.062 10:25:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:18.320 true 00:18:18.320 10:25:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:18.578 [2024-07-15 10:25:55.688151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:18.578 [2024-07-15 10:25:55.688197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:18:18.578 [2024-07-15 10:25:55.688218] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20e80d0 00:18:18.578 [2024-07-15 10:25:55.688231] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:18.578 [2024-07-15 10:25:55.690045] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:18.578 [2024-07-15 10:25:55.690078] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:18.578 BaseBdev1 00:18:18.578 10:25:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:18.578 10:25:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:18.837 BaseBdev2_malloc 00:18:18.837 10:25:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:19.094 true 00:18:19.094 10:25:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:19.352 [2024-07-15 10:25:56.362548] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:19.352 [2024-07-15 10:25:56.362593] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.352 [2024-07-15 10:25:56.362613] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20ec910 00:18:19.352 [2024-07-15 10:25:56.362631] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.352 [2024-07-15 10:25:56.364058] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.352 [2024-07-15 10:25:56.364087] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:19.352 BaseBdev2 00:18:19.352 10:25:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:19.352 10:25:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:19.352 BaseBdev3_malloc 00:18:19.610 10:25:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:19.610 true 00:18:19.610 10:25:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:19.868 [2024-07-15 10:25:57.020842] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:19.868 [2024-07-15 10:25:57.020888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:19.868 [2024-07-15 10:25:57.020907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20eebd0 00:18:19.868 [2024-07-15 10:25:57.020920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:19.868 [2024-07-15 10:25:57.022317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:19.868 [2024-07-15 10:25:57.022345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:19.868 BaseBdev3 00:18:19.868 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:18:20.126 [2024-07-15 10:25:57.261510] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:20.126 [2024-07-15 10:25:57.262796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:20.126 [2024-07-15 10:25:57.262863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:20.126 [2024-07-15 10:25:57.263080] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f0280 00:18:20.126 [2024-07-15 10:25:57.263092] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:20.126 [2024-07-15 10:25:57.263286] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20efe20 00:18:20.126 [2024-07-15 10:25:57.263437] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f0280 00:18:20.126 [2024-07-15 10:25:57.263447] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f0280 00:18:20.126 [2024-07-15 10:25:57.263550] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.126 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:20.384 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.384 "name": "raid_bdev1", 00:18:20.384 "uuid": "5f7b67d7-636e-4114-9188-c949958c33c6", 00:18:20.384 "strip_size_kb": 0, 00:18:20.384 "state": "online", 00:18:20.384 "raid_level": "raid1", 00:18:20.384 "superblock": true, 00:18:20.384 "num_base_bdevs": 3, 00:18:20.384 "num_base_bdevs_discovered": 3, 00:18:20.384 "num_base_bdevs_operational": 3, 00:18:20.384 "base_bdevs_list": [ 00:18:20.384 { 00:18:20.384 "name": "BaseBdev1", 00:18:20.384 "uuid": "1bbd751b-62cb-5e1e-8c8d-8cb072690134", 00:18:20.384 "is_configured": true, 00:18:20.384 "data_offset": 2048, 00:18:20.384 "data_size": 63488 00:18:20.384 }, 00:18:20.384 { 00:18:20.384 "name": "BaseBdev2", 00:18:20.384 "uuid": "4a30176f-a2ca-55fc-bdc6-c774e6867d4d", 00:18:20.384 "is_configured": true, 00:18:20.384 "data_offset": 2048, 00:18:20.384 "data_size": 63488 00:18:20.384 }, 00:18:20.384 { 00:18:20.384 "name": "BaseBdev3", 00:18:20.384 "uuid": "b24b3a65-bc2c-58e7-a04a-d32f798698a9", 00:18:20.384 "is_configured": true, 00:18:20.384 "data_offset": 2048, 00:18:20.384 "data_size": 63488 00:18:20.384 } 00:18:20.384 ] 00:18:20.384 }' 00:18:20.384 10:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.384 10:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.990 10:25:58 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:18:20.990 10:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:20.990 [2024-07-15 10:25:58.083980] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f3de00 00:18:21.924 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:18:22.181 [2024-07-15 10:25:59.236700] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:18:22.181 [2024-07-15 10:25:59.236760] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:22.181 [2024-07-15 10:25:59.236963] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1f3de00 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:22.181 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.438 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.438 "name": "raid_bdev1", 00:18:22.438 "uuid": "5f7b67d7-636e-4114-9188-c949958c33c6", 00:18:22.438 "strip_size_kb": 0, 00:18:22.438 "state": "online", 00:18:22.438 "raid_level": "raid1", 00:18:22.438 "superblock": true, 00:18:22.438 "num_base_bdevs": 3, 00:18:22.438 "num_base_bdevs_discovered": 2, 00:18:22.438 "num_base_bdevs_operational": 2, 00:18:22.438 "base_bdevs_list": [ 00:18:22.438 { 00:18:22.438 "name": null, 00:18:22.438 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.438 "is_configured": false, 00:18:22.438 "data_offset": 2048, 00:18:22.438 "data_size": 63488 00:18:22.438 }, 00:18:22.438 { 00:18:22.438 "name": "BaseBdev2", 00:18:22.438 "uuid": "4a30176f-a2ca-55fc-bdc6-c774e6867d4d", 00:18:22.438 "is_configured": true, 00:18:22.438 "data_offset": 2048, 00:18:22.438 "data_size": 63488 00:18:22.438 }, 00:18:22.438 { 00:18:22.438 "name": "BaseBdev3", 00:18:22.438 "uuid": "b24b3a65-bc2c-58e7-a04a-d32f798698a9", 00:18:22.438 "is_configured": true, 00:18:22.438 "data_offset": 2048, 
00:18:22.438 "data_size": 63488 00:18:22.438 } 00:18:22.438 ] 00:18:22.438 }' 00:18:22.438 10:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.438 10:25:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.003 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:23.261 [2024-07-15 10:26:00.356791] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:23.261 [2024-07-15 10:26:00.356836] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:23.261 [2024-07-15 10:26:00.360043] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:23.261 [2024-07-15 10:26:00.360076] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:23.261 [2024-07-15 10:26:00.360152] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:23.261 [2024-07-15 10:26:00.360164] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f0280 name raid_bdev1, state offline 00:18:23.261 0 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 534335 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 534335 ']' 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 534335 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 534335 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 534335' 00:18:23.261 killing process with pid 534335 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 534335 00:18:23.261 [2024-07-15 10:26:00.425433] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:23.261 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 534335 00:18:23.261 [2024-07-15 10:26:00.445617] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.M6ZrQTOfAZ 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:18:23.519 00:18:23.519 real 0m6.720s 00:18:23.519 user 0m10.518s 00:18:23.519 sys 0m1.195s 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:23.519 10:26:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.519 ************************************ 00:18:23.520 END TEST raid_write_error_test 00:18:23.520 
************************************ 00:18:23.777 10:26:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:23.777 10:26:00 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:18:23.778 10:26:00 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:18:23.778 10:26:00 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:18:23.778 10:26:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:23.778 10:26:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:23.778 10:26:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:23.778 ************************************ 00:18:23.778 START TEST raid_state_function_test 00:18:23.778 ************************************ 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- 
# echo BaseBdev2 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 
00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=535316 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 535316' 00:18:23.778 Process raid pid: 535316 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 535316 /var/tmp/spdk-raid.sock 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 535316 ']' 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:23.778 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:23.778 10:26:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.778 [2024-07-15 10:26:00.829957] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:18:23.778 [2024-07-15 10:26:00.830022] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:23.778 [2024-07-15 10:26:00.956918] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:24.035 [2024-07-15 10:26:01.062315] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:24.035 [2024-07-15 10:26:01.117815] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:24.035 [2024-07-15 10:26:01.117845] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:24.292 10:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:24.292 10:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:18:24.292 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:24.549 [2024-07-15 10:26:01.528279] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:24.549 [2024-07-15 10:26:01.528323] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:24.549 [2024-07-15 10:26:01.528334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:24.549 [2024-07-15 10:26:01.528346] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:24.550 [2024-07-15 10:26:01.528355] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:24.550 [2024-07-15 10:26:01.528366] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:18:24.550 [2024-07-15 10:26:01.528374] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:24.550 [2024-07-15 10:26:01.528385] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.550 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:24.807 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.807 "name": "Existed_Raid", 00:18:24.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.807 "strip_size_kb": 64, 
00:18:24.807 "state": "configuring", 00:18:24.807 "raid_level": "raid0", 00:18:24.807 "superblock": false, 00:18:24.807 "num_base_bdevs": 4, 00:18:24.807 "num_base_bdevs_discovered": 0, 00:18:24.807 "num_base_bdevs_operational": 4, 00:18:24.807 "base_bdevs_list": [ 00:18:24.807 { 00:18:24.807 "name": "BaseBdev1", 00:18:24.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.807 "is_configured": false, 00:18:24.807 "data_offset": 0, 00:18:24.807 "data_size": 0 00:18:24.807 }, 00:18:24.807 { 00:18:24.807 "name": "BaseBdev2", 00:18:24.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.807 "is_configured": false, 00:18:24.807 "data_offset": 0, 00:18:24.807 "data_size": 0 00:18:24.807 }, 00:18:24.807 { 00:18:24.807 "name": "BaseBdev3", 00:18:24.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.807 "is_configured": false, 00:18:24.807 "data_offset": 0, 00:18:24.807 "data_size": 0 00:18:24.807 }, 00:18:24.807 { 00:18:24.807 "name": "BaseBdev4", 00:18:24.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.807 "is_configured": false, 00:18:24.807 "data_offset": 0, 00:18:24.807 "data_size": 0 00:18:24.807 } 00:18:24.807 ] 00:18:24.807 }' 00:18:24.807 10:26:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.807 10:26:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.371 10:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:25.629 [2024-07-15 10:26:02.619031] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:25.629 [2024-07-15 10:26:02.619059] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa6baa0 name Existed_Raid, state configuring 00:18:25.629 10:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:25.887 [2024-07-15 10:26:02.863697] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:25.887 [2024-07-15 10:26:02.863728] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:25.887 [2024-07-15 10:26:02.863737] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:25.887 [2024-07-15 10:26:02.863749] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:25.887 [2024-07-15 10:26:02.863757] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:25.887 [2024-07-15 10:26:02.863768] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:25.887 [2024-07-15 10:26:02.863777] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:25.887 [2024-07-15 10:26:02.863788] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:25.887 10:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:26.144 [2024-07-15 10:26:03.106188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:26.144 BaseBdev1 00:18:26.144 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:26.144 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:26.144 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:26.144 10:26:03 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:18:26.144 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:26.144 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:26.144 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:26.401 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:26.658 [ 00:18:26.658 { 00:18:26.658 "name": "BaseBdev1", 00:18:26.658 "aliases": [ 00:18:26.658 "e155c356-ca60-4695-bfbd-e40574856b2a" 00:18:26.658 ], 00:18:26.658 "product_name": "Malloc disk", 00:18:26.658 "block_size": 512, 00:18:26.658 "num_blocks": 65536, 00:18:26.658 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:26.658 "assigned_rate_limits": { 00:18:26.658 "rw_ios_per_sec": 0, 00:18:26.658 "rw_mbytes_per_sec": 0, 00:18:26.658 "r_mbytes_per_sec": 0, 00:18:26.658 "w_mbytes_per_sec": 0 00:18:26.658 }, 00:18:26.658 "claimed": true, 00:18:26.658 "claim_type": "exclusive_write", 00:18:26.658 "zoned": false, 00:18:26.658 "supported_io_types": { 00:18:26.658 "read": true, 00:18:26.658 "write": true, 00:18:26.658 "unmap": true, 00:18:26.658 "flush": true, 00:18:26.658 "reset": true, 00:18:26.658 "nvme_admin": false, 00:18:26.658 "nvme_io": false, 00:18:26.658 "nvme_io_md": false, 00:18:26.658 "write_zeroes": true, 00:18:26.658 "zcopy": true, 00:18:26.658 "get_zone_info": false, 00:18:26.658 "zone_management": false, 00:18:26.658 "zone_append": false, 00:18:26.658 "compare": false, 00:18:26.658 "compare_and_write": false, 00:18:26.658 "abort": true, 00:18:26.658 "seek_hole": false, 00:18:26.658 "seek_data": false, 00:18:26.658 "copy": true, 00:18:26.658 "nvme_iov_md": false 
00:18:26.658 }, 00:18:26.658 "memory_domains": [ 00:18:26.658 { 00:18:26.658 "dma_device_id": "system", 00:18:26.658 "dma_device_type": 1 00:18:26.658 }, 00:18:26.658 { 00:18:26.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:26.658 "dma_device_type": 2 00:18:26.658 } 00:18:26.658 ], 00:18:26.658 "driver_specific": {} 00:18:26.658 } 00:18:26.658 ] 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.658 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.659 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.659 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.659 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.659 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.916 10:26:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.916 "name": "Existed_Raid", 00:18:26.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.916 "strip_size_kb": 64, 00:18:26.916 "state": "configuring", 00:18:26.916 "raid_level": "raid0", 00:18:26.916 "superblock": false, 00:18:26.916 "num_base_bdevs": 4, 00:18:26.916 "num_base_bdevs_discovered": 1, 00:18:26.916 "num_base_bdevs_operational": 4, 00:18:26.916 "base_bdevs_list": [ 00:18:26.916 { 00:18:26.916 "name": "BaseBdev1", 00:18:26.916 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:26.916 "is_configured": true, 00:18:26.916 "data_offset": 0, 00:18:26.916 "data_size": 65536 00:18:26.916 }, 00:18:26.916 { 00:18:26.916 "name": "BaseBdev2", 00:18:26.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.916 "is_configured": false, 00:18:26.916 "data_offset": 0, 00:18:26.916 "data_size": 0 00:18:26.916 }, 00:18:26.916 { 00:18:26.916 "name": "BaseBdev3", 00:18:26.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.916 "is_configured": false, 00:18:26.916 "data_offset": 0, 00:18:26.916 "data_size": 0 00:18:26.916 }, 00:18:26.916 { 00:18:26.916 "name": "BaseBdev4", 00:18:26.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.916 "is_configured": false, 00:18:26.916 "data_offset": 0, 00:18:26.916 "data_size": 0 00:18:26.917 } 00:18:26.917 ] 00:18:26.917 }' 00:18:26.917 10:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.917 10:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:27.483 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:27.741 [2024-07-15 10:26:04.690363] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:27.741 [2024-07-15 10:26:04.690405] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa6b310 name Existed_Raid, state configuring 00:18:27.741 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:27.741 [2024-07-15 10:26:04.931048] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:27.741 [2024-07-15 10:26:04.932505] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:27.741 [2024-07-15 10:26:04.932537] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:27.741 [2024-07-15 10:26:04.932547] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:27.741 [2024-07-15 10:26:04.932558] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:27.741 [2024-07-15 10:26:04.932567] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:27.741 [2024-07-15 10:26:04.932578] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:27.999 
10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.999 10:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.999 10:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.999 "name": "Existed_Raid", 00:18:27.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.999 "strip_size_kb": 64, 00:18:27.999 "state": "configuring", 00:18:27.999 "raid_level": "raid0", 00:18:27.999 "superblock": false, 00:18:27.999 "num_base_bdevs": 4, 00:18:27.999 "num_base_bdevs_discovered": 1, 00:18:27.999 "num_base_bdevs_operational": 4, 00:18:27.999 "base_bdevs_list": [ 00:18:27.999 { 00:18:27.999 "name": "BaseBdev1", 00:18:27.999 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:27.999 "is_configured": true, 00:18:27.999 "data_offset": 0, 00:18:27.999 "data_size": 65536 00:18:27.999 }, 00:18:27.999 { 00:18:27.999 "name": "BaseBdev2", 00:18:27.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.999 "is_configured": false, 00:18:27.999 "data_offset": 0, 00:18:27.999 "data_size": 0 00:18:27.999 }, 00:18:27.999 { 00:18:27.999 "name": "BaseBdev3", 00:18:27.999 
"uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.999 "is_configured": false, 00:18:27.999 "data_offset": 0, 00:18:27.999 "data_size": 0 00:18:27.999 }, 00:18:27.999 { 00:18:27.999 "name": "BaseBdev4", 00:18:27.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:27.999 "is_configured": false, 00:18:27.999 "data_offset": 0, 00:18:27.999 "data_size": 0 00:18:27.999 } 00:18:27.999 ] 00:18:27.999 }' 00:18:27.999 10:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.999 10:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.932 10:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:28.932 [2024-07-15 10:26:06.029384] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:28.932 BaseBdev2 00:18:28.932 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:28.932 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:28.932 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:28.932 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:28.932 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:28.932 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:28.932 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:29.190 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:29.448 [ 00:18:29.448 { 00:18:29.449 "name": "BaseBdev2", 00:18:29.449 "aliases": [ 00:18:29.449 "249ad702-9f3c-497d-b87a-4949d6f19bdd" 00:18:29.449 ], 00:18:29.449 "product_name": "Malloc disk", 00:18:29.449 "block_size": 512, 00:18:29.449 "num_blocks": 65536, 00:18:29.449 "uuid": "249ad702-9f3c-497d-b87a-4949d6f19bdd", 00:18:29.449 "assigned_rate_limits": { 00:18:29.449 "rw_ios_per_sec": 0, 00:18:29.449 "rw_mbytes_per_sec": 0, 00:18:29.449 "r_mbytes_per_sec": 0, 00:18:29.449 "w_mbytes_per_sec": 0 00:18:29.449 }, 00:18:29.449 "claimed": true, 00:18:29.449 "claim_type": "exclusive_write", 00:18:29.449 "zoned": false, 00:18:29.449 "supported_io_types": { 00:18:29.449 "read": true, 00:18:29.449 "write": true, 00:18:29.449 "unmap": true, 00:18:29.449 "flush": true, 00:18:29.449 "reset": true, 00:18:29.449 "nvme_admin": false, 00:18:29.449 "nvme_io": false, 00:18:29.449 "nvme_io_md": false, 00:18:29.449 "write_zeroes": true, 00:18:29.449 "zcopy": true, 00:18:29.449 "get_zone_info": false, 00:18:29.449 "zone_management": false, 00:18:29.449 "zone_append": false, 00:18:29.449 "compare": false, 00:18:29.449 "compare_and_write": false, 00:18:29.449 "abort": true, 00:18:29.449 "seek_hole": false, 00:18:29.449 "seek_data": false, 00:18:29.449 "copy": true, 00:18:29.449 "nvme_iov_md": false 00:18:29.449 }, 00:18:29.449 "memory_domains": [ 00:18:29.449 { 00:18:29.449 "dma_device_id": "system", 00:18:29.449 "dma_device_type": 1 00:18:29.449 }, 00:18:29.449 { 00:18:29.449 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.449 "dma_device_type": 2 00:18:29.449 } 00:18:29.449 ], 00:18:29.449 "driver_specific": {} 00:18:29.449 } 00:18:29.449 ] 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.449 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.707 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.707 "name": "Existed_Raid", 00:18:29.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.707 "strip_size_kb": 64, 00:18:29.707 "state": "configuring", 00:18:29.707 "raid_level": "raid0", 00:18:29.707 "superblock": false, 00:18:29.707 "num_base_bdevs": 4, 00:18:29.707 
"num_base_bdevs_discovered": 2, 00:18:29.707 "num_base_bdevs_operational": 4, 00:18:29.707 "base_bdevs_list": [ 00:18:29.707 { 00:18:29.707 "name": "BaseBdev1", 00:18:29.707 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:29.707 "is_configured": true, 00:18:29.707 "data_offset": 0, 00:18:29.707 "data_size": 65536 00:18:29.707 }, 00:18:29.707 { 00:18:29.707 "name": "BaseBdev2", 00:18:29.707 "uuid": "249ad702-9f3c-497d-b87a-4949d6f19bdd", 00:18:29.707 "is_configured": true, 00:18:29.707 "data_offset": 0, 00:18:29.707 "data_size": 65536 00:18:29.707 }, 00:18:29.707 { 00:18:29.707 "name": "BaseBdev3", 00:18:29.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.707 "is_configured": false, 00:18:29.707 "data_offset": 0, 00:18:29.707 "data_size": 0 00:18:29.707 }, 00:18:29.707 { 00:18:29.707 "name": "BaseBdev4", 00:18:29.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.707 "is_configured": false, 00:18:29.707 "data_offset": 0, 00:18:29.707 "data_size": 0 00:18:29.707 } 00:18:29.707 ] 00:18:29.707 }' 00:18:29.707 10:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.707 10:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:30.272 10:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:30.530 [2024-07-15 10:26:07.628978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:30.530 BaseBdev3 00:18:30.530 10:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:30.530 10:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:30.530 10:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:30.530 10:26:07 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:30.530 10:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:30.530 10:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:30.530 10:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:30.789 10:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:31.047 [ 00:18:31.047 { 00:18:31.047 "name": "BaseBdev3", 00:18:31.047 "aliases": [ 00:18:31.047 "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5" 00:18:31.047 ], 00:18:31.047 "product_name": "Malloc disk", 00:18:31.047 "block_size": 512, 00:18:31.047 "num_blocks": 65536, 00:18:31.047 "uuid": "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5", 00:18:31.047 "assigned_rate_limits": { 00:18:31.047 "rw_ios_per_sec": 0, 00:18:31.047 "rw_mbytes_per_sec": 0, 00:18:31.047 "r_mbytes_per_sec": 0, 00:18:31.047 "w_mbytes_per_sec": 0 00:18:31.047 }, 00:18:31.047 "claimed": true, 00:18:31.047 "claim_type": "exclusive_write", 00:18:31.047 "zoned": false, 00:18:31.047 "supported_io_types": { 00:18:31.047 "read": true, 00:18:31.047 "write": true, 00:18:31.047 "unmap": true, 00:18:31.047 "flush": true, 00:18:31.047 "reset": true, 00:18:31.047 "nvme_admin": false, 00:18:31.047 "nvme_io": false, 00:18:31.047 "nvme_io_md": false, 00:18:31.047 "write_zeroes": true, 00:18:31.047 "zcopy": true, 00:18:31.047 "get_zone_info": false, 00:18:31.047 "zone_management": false, 00:18:31.047 "zone_append": false, 00:18:31.047 "compare": false, 00:18:31.047 "compare_and_write": false, 00:18:31.047 "abort": true, 00:18:31.047 "seek_hole": false, 00:18:31.047 "seek_data": false, 00:18:31.047 "copy": 
true, 00:18:31.047 "nvme_iov_md": false 00:18:31.047 }, 00:18:31.047 "memory_domains": [ 00:18:31.047 { 00:18:31.047 "dma_device_id": "system", 00:18:31.047 "dma_device_type": 1 00:18:31.047 }, 00:18:31.047 { 00:18:31.047 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.047 "dma_device_type": 2 00:18:31.047 } 00:18:31.047 ], 00:18:31.047 "driver_specific": {} 00:18:31.047 } 00:18:31.047 ] 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.047 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:31.306 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:31.306 "name": "Existed_Raid", 00:18:31.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.306 "strip_size_kb": 64, 00:18:31.306 "state": "configuring", 00:18:31.306 "raid_level": "raid0", 00:18:31.306 "superblock": false, 00:18:31.306 "num_base_bdevs": 4, 00:18:31.306 "num_base_bdevs_discovered": 3, 00:18:31.306 "num_base_bdevs_operational": 4, 00:18:31.306 "base_bdevs_list": [ 00:18:31.306 { 00:18:31.306 "name": "BaseBdev1", 00:18:31.306 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:31.306 "is_configured": true, 00:18:31.306 "data_offset": 0, 00:18:31.306 "data_size": 65536 00:18:31.306 }, 00:18:31.306 { 00:18:31.306 "name": "BaseBdev2", 00:18:31.306 "uuid": "249ad702-9f3c-497d-b87a-4949d6f19bdd", 00:18:31.306 "is_configured": true, 00:18:31.306 "data_offset": 0, 00:18:31.306 "data_size": 65536 00:18:31.306 }, 00:18:31.306 { 00:18:31.306 "name": "BaseBdev3", 00:18:31.306 "uuid": "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5", 00:18:31.306 "is_configured": true, 00:18:31.306 "data_offset": 0, 00:18:31.306 "data_size": 65536 00:18:31.306 }, 00:18:31.306 { 00:18:31.306 "name": "BaseBdev4", 00:18:31.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:31.306 "is_configured": false, 00:18:31.306 "data_offset": 0, 00:18:31.306 "data_size": 0 00:18:31.306 } 00:18:31.306 ] 00:18:31.306 }' 00:18:31.306 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:31.306 10:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.872 10:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:32.130 [2024-07-15 10:26:09.188522] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:32.130 [2024-07-15 10:26:09.188557] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa6c350 00:18:32.130 [2024-07-15 10:26:09.188565] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:32.130 [2024-07-15 10:26:09.188814] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa6c020 00:18:32.130 [2024-07-15 10:26:09.188948] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa6c350 00:18:32.130 [2024-07-15 10:26:09.188959] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa6c350 00:18:32.130 [2024-07-15 10:26:09.189125] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.130 BaseBdev4 00:18:32.130 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:32.130 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:32.130 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:32.130 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:32.130 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:32.130 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:32.130 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:32.387 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:32.644 [ 00:18:32.644 { 00:18:32.644 "name": "BaseBdev4", 00:18:32.644 "aliases": [ 00:18:32.644 "20b8548a-50e0-41eb-9385-dcfdb605327d" 00:18:32.644 ], 00:18:32.644 "product_name": "Malloc disk", 00:18:32.644 "block_size": 512, 00:18:32.644 "num_blocks": 65536, 00:18:32.644 "uuid": "20b8548a-50e0-41eb-9385-dcfdb605327d", 00:18:32.644 "assigned_rate_limits": { 00:18:32.644 "rw_ios_per_sec": 0, 00:18:32.644 "rw_mbytes_per_sec": 0, 00:18:32.644 "r_mbytes_per_sec": 0, 00:18:32.644 "w_mbytes_per_sec": 0 00:18:32.644 }, 00:18:32.644 "claimed": true, 00:18:32.644 "claim_type": "exclusive_write", 00:18:32.644 "zoned": false, 00:18:32.644 "supported_io_types": { 00:18:32.644 "read": true, 00:18:32.644 "write": true, 00:18:32.644 "unmap": true, 00:18:32.644 "flush": true, 00:18:32.644 "reset": true, 00:18:32.644 "nvme_admin": false, 00:18:32.644 "nvme_io": false, 00:18:32.644 "nvme_io_md": false, 00:18:32.644 "write_zeroes": true, 00:18:32.644 "zcopy": true, 00:18:32.644 "get_zone_info": false, 00:18:32.644 "zone_management": false, 00:18:32.644 "zone_append": false, 00:18:32.644 "compare": false, 00:18:32.644 "compare_and_write": false, 00:18:32.644 "abort": true, 00:18:32.644 "seek_hole": false, 00:18:32.644 "seek_data": false, 00:18:32.644 "copy": true, 00:18:32.644 "nvme_iov_md": false 00:18:32.644 }, 00:18:32.644 "memory_domains": [ 00:18:32.644 { 00:18:32.644 "dma_device_id": "system", 00:18:32.644 "dma_device_type": 1 00:18:32.644 }, 00:18:32.644 { 00:18:32.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.644 "dma_device_type": 2 00:18:32.644 } 00:18:32.644 ], 00:18:32.644 "driver_specific": {} 00:18:32.644 } 00:18:32.644 ] 00:18:32.644 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:32.644 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
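Each `verify_raid_bdev_state` invocation above fetches `bdev_raid_get_bdevs all`, selects the named raid bdev with jq, and compares fields such as `state` and `num_base_bdevs_discovered` against the expected values. A reduced, self-contained sketch of that check (the inline JSON is a stand-in for the real RPC reply, and jq is assumed to be installed; the actual helper in bdev_raid.sh checks several more fields):

```shell
#!/usr/bin/env bash
# Reduced sketch of verify_raid_bdev_state: select the named raid bdev
# from (faked) bdev_raid_get_bdevs output and compare its state.
raid_bdevs='[{"name": "Existed_Raid", "state": "online",
              "num_base_bdevs": 4, "num_base_bdevs_discovered": 4}]'

verify_state() {
    local name=$1 expected=$2
    local state
    # Same select-by-name filter the traced test uses.
    state=$(echo "$raid_bdevs" |
        jq -r ".[] | select(.name == \"$name\") | .state")
    [ "$state" = "$expected" ]
}

verify_state Existed_Raid online && echo "Existed_Raid is online"
```

The trace shows the same filter, `jq -r '.[] | select(.name == "Existed_Raid")'`, walking the state from `configuring` (2, then 3 of 4 base bdevs discovered) to `online` once BaseBdev4 is claimed.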
00:18:32.644 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:32.644 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:32.644 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.645 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:32.902 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.902 "name": "Existed_Raid", 00:18:32.902 "uuid": "60b8839b-0219-49ff-8b56-ad1c38985e55", 00:18:32.902 "strip_size_kb": 64, 00:18:32.902 "state": "online", 00:18:32.902 "raid_level": "raid0", 00:18:32.902 "superblock": false, 00:18:32.902 "num_base_bdevs": 4, 00:18:32.902 
"num_base_bdevs_discovered": 4, 00:18:32.902 "num_base_bdevs_operational": 4, 00:18:32.902 "base_bdevs_list": [ 00:18:32.902 { 00:18:32.902 "name": "BaseBdev1", 00:18:32.902 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:32.902 "is_configured": true, 00:18:32.902 "data_offset": 0, 00:18:32.902 "data_size": 65536 00:18:32.902 }, 00:18:32.902 { 00:18:32.902 "name": "BaseBdev2", 00:18:32.902 "uuid": "249ad702-9f3c-497d-b87a-4949d6f19bdd", 00:18:32.902 "is_configured": true, 00:18:32.902 "data_offset": 0, 00:18:32.902 "data_size": 65536 00:18:32.902 }, 00:18:32.902 { 00:18:32.902 "name": "BaseBdev3", 00:18:32.902 "uuid": "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5", 00:18:32.902 "is_configured": true, 00:18:32.902 "data_offset": 0, 00:18:32.902 "data_size": 65536 00:18:32.902 }, 00:18:32.902 { 00:18:32.902 "name": "BaseBdev4", 00:18:32.902 "uuid": "20b8548a-50e0-41eb-9385-dcfdb605327d", 00:18:32.902 "is_configured": true, 00:18:32.902 "data_offset": 0, 00:18:32.902 "data_size": 65536 00:18:32.902 } 00:18:32.902 ] 00:18:32.902 }' 00:18:32.902 10:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.902 10:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:33.465 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:33.722 [2024-07-15 10:26:10.708891] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:33.722 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:33.722 "name": "Existed_Raid", 00:18:33.722 "aliases": [ 00:18:33.722 "60b8839b-0219-49ff-8b56-ad1c38985e55" 00:18:33.722 ], 00:18:33.722 "product_name": "Raid Volume", 00:18:33.722 "block_size": 512, 00:18:33.722 "num_blocks": 262144, 00:18:33.722 "uuid": "60b8839b-0219-49ff-8b56-ad1c38985e55", 00:18:33.722 "assigned_rate_limits": { 00:18:33.722 "rw_ios_per_sec": 0, 00:18:33.722 "rw_mbytes_per_sec": 0, 00:18:33.722 "r_mbytes_per_sec": 0, 00:18:33.722 "w_mbytes_per_sec": 0 00:18:33.722 }, 00:18:33.722 "claimed": false, 00:18:33.722 "zoned": false, 00:18:33.722 "supported_io_types": { 00:18:33.722 "read": true, 00:18:33.722 "write": true, 00:18:33.722 "unmap": true, 00:18:33.722 "flush": true, 00:18:33.722 "reset": true, 00:18:33.722 "nvme_admin": false, 00:18:33.722 "nvme_io": false, 00:18:33.722 "nvme_io_md": false, 00:18:33.722 "write_zeroes": true, 00:18:33.722 "zcopy": false, 00:18:33.722 "get_zone_info": false, 00:18:33.722 "zone_management": false, 00:18:33.722 "zone_append": false, 00:18:33.722 "compare": false, 00:18:33.722 "compare_and_write": false, 00:18:33.722 "abort": false, 00:18:33.722 "seek_hole": false, 00:18:33.722 "seek_data": false, 00:18:33.722 "copy": false, 00:18:33.722 "nvme_iov_md": false 00:18:33.722 }, 00:18:33.722 "memory_domains": [ 00:18:33.722 { 00:18:33.722 "dma_device_id": "system", 00:18:33.722 "dma_device_type": 1 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.722 "dma_device_type": 2 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 
"dma_device_id": "system", 00:18:33.722 "dma_device_type": 1 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.722 "dma_device_type": 2 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "dma_device_id": "system", 00:18:33.722 "dma_device_type": 1 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.722 "dma_device_type": 2 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "dma_device_id": "system", 00:18:33.722 "dma_device_type": 1 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.722 "dma_device_type": 2 00:18:33.722 } 00:18:33.722 ], 00:18:33.722 "driver_specific": { 00:18:33.722 "raid": { 00:18:33.722 "uuid": "60b8839b-0219-49ff-8b56-ad1c38985e55", 00:18:33.722 "strip_size_kb": 64, 00:18:33.722 "state": "online", 00:18:33.722 "raid_level": "raid0", 00:18:33.722 "superblock": false, 00:18:33.722 "num_base_bdevs": 4, 00:18:33.722 "num_base_bdevs_discovered": 4, 00:18:33.722 "num_base_bdevs_operational": 4, 00:18:33.722 "base_bdevs_list": [ 00:18:33.722 { 00:18:33.722 "name": "BaseBdev1", 00:18:33.722 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:33.722 "is_configured": true, 00:18:33.722 "data_offset": 0, 00:18:33.722 "data_size": 65536 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "name": "BaseBdev2", 00:18:33.722 "uuid": "249ad702-9f3c-497d-b87a-4949d6f19bdd", 00:18:33.722 "is_configured": true, 00:18:33.722 "data_offset": 0, 00:18:33.722 "data_size": 65536 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "name": "BaseBdev3", 00:18:33.722 "uuid": "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5", 00:18:33.722 "is_configured": true, 00:18:33.722 "data_offset": 0, 00:18:33.722 "data_size": 65536 00:18:33.722 }, 00:18:33.722 { 00:18:33.722 "name": "BaseBdev4", 00:18:33.722 "uuid": "20b8548a-50e0-41eb-9385-dcfdb605327d", 00:18:33.722 "is_configured": true, 00:18:33.722 "data_offset": 0, 00:18:33.722 "data_size": 65536 00:18:33.722 } 00:18:33.722 ] 
00:18:33.722 } 00:18:33.722 } 00:18:33.722 }' 00:18:33.722 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:33.722 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:33.722 BaseBdev2 00:18:33.722 BaseBdev3 00:18:33.722 BaseBdev4' 00:18:33.722 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:33.722 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:33.722 10:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:33.980 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:33.980 "name": "BaseBdev1", 00:18:33.980 "aliases": [ 00:18:33.980 "e155c356-ca60-4695-bfbd-e40574856b2a" 00:18:33.980 ], 00:18:33.980 "product_name": "Malloc disk", 00:18:33.980 "block_size": 512, 00:18:33.980 "num_blocks": 65536, 00:18:33.980 "uuid": "e155c356-ca60-4695-bfbd-e40574856b2a", 00:18:33.980 "assigned_rate_limits": { 00:18:33.980 "rw_ios_per_sec": 0, 00:18:33.980 "rw_mbytes_per_sec": 0, 00:18:33.980 "r_mbytes_per_sec": 0, 00:18:33.980 "w_mbytes_per_sec": 0 00:18:33.980 }, 00:18:33.980 "claimed": true, 00:18:33.980 "claim_type": "exclusive_write", 00:18:33.980 "zoned": false, 00:18:33.980 "supported_io_types": { 00:18:33.980 "read": true, 00:18:33.980 "write": true, 00:18:33.980 "unmap": true, 00:18:33.980 "flush": true, 00:18:33.980 "reset": true, 00:18:33.980 "nvme_admin": false, 00:18:33.980 "nvme_io": false, 00:18:33.980 "nvme_io_md": false, 00:18:33.980 "write_zeroes": true, 00:18:33.980 "zcopy": true, 00:18:33.980 "get_zone_info": false, 00:18:33.980 "zone_management": false, 00:18:33.980 "zone_append": false, 00:18:33.980 "compare": 
false, 00:18:33.980 "compare_and_write": false, 00:18:33.980 "abort": true, 00:18:33.980 "seek_hole": false, 00:18:33.980 "seek_data": false, 00:18:33.980 "copy": true, 00:18:33.980 "nvme_iov_md": false 00:18:33.980 }, 00:18:33.980 "memory_domains": [ 00:18:33.980 { 00:18:33.980 "dma_device_id": "system", 00:18:33.980 "dma_device_type": 1 00:18:33.980 }, 00:18:33.980 { 00:18:33.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.980 "dma_device_type": 2 00:18:33.980 } 00:18:33.980 ], 00:18:33.980 "driver_specific": {} 00:18:33.980 }' 00:18:33.980 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.980 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.980 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:33.980 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.980 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:34.238 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:34.495 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:34.495 "name": "BaseBdev2", 00:18:34.495 "aliases": [ 00:18:34.495 "249ad702-9f3c-497d-b87a-4949d6f19bdd" 00:18:34.495 ], 00:18:34.495 "product_name": "Malloc disk", 00:18:34.495 "block_size": 512, 00:18:34.495 "num_blocks": 65536, 00:18:34.495 "uuid": "249ad702-9f3c-497d-b87a-4949d6f19bdd", 00:18:34.495 "assigned_rate_limits": { 00:18:34.495 "rw_ios_per_sec": 0, 00:18:34.496 "rw_mbytes_per_sec": 0, 00:18:34.496 "r_mbytes_per_sec": 0, 00:18:34.496 "w_mbytes_per_sec": 0 00:18:34.496 }, 00:18:34.496 "claimed": true, 00:18:34.496 "claim_type": "exclusive_write", 00:18:34.496 "zoned": false, 00:18:34.496 "supported_io_types": { 00:18:34.496 "read": true, 00:18:34.496 "write": true, 00:18:34.496 "unmap": true, 00:18:34.496 "flush": true, 00:18:34.496 "reset": true, 00:18:34.496 "nvme_admin": false, 00:18:34.496 "nvme_io": false, 00:18:34.496 "nvme_io_md": false, 00:18:34.496 "write_zeroes": true, 00:18:34.496 "zcopy": true, 00:18:34.496 "get_zone_info": false, 00:18:34.496 "zone_management": false, 00:18:34.496 "zone_append": false, 00:18:34.496 "compare": false, 00:18:34.496 "compare_and_write": false, 00:18:34.496 "abort": true, 00:18:34.496 "seek_hole": false, 00:18:34.496 "seek_data": false, 00:18:34.496 "copy": true, 00:18:34.496 "nvme_iov_md": false 00:18:34.496 }, 00:18:34.496 "memory_domains": [ 00:18:34.496 { 00:18:34.496 "dma_device_id": "system", 00:18:34.496 "dma_device_type": 1 00:18:34.496 }, 00:18:34.496 { 00:18:34.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.496 "dma_device_type": 2 00:18:34.496 } 00:18:34.496 ], 00:18:34.496 "driver_specific": {} 00:18:34.496 }' 00:18:34.496 10:26:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.496 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:34.753 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:35.011 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:35.011 10:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:35.011 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:35.011 "name": "BaseBdev3", 00:18:35.011 "aliases": [ 00:18:35.011 "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5" 00:18:35.011 ], 00:18:35.011 "product_name": "Malloc disk", 00:18:35.011 "block_size": 512, 00:18:35.011 "num_blocks": 65536, 00:18:35.011 "uuid": "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5", 
00:18:35.011 "assigned_rate_limits": { 00:18:35.011 "rw_ios_per_sec": 0, 00:18:35.011 "rw_mbytes_per_sec": 0, 00:18:35.011 "r_mbytes_per_sec": 0, 00:18:35.011 "w_mbytes_per_sec": 0 00:18:35.011 }, 00:18:35.011 "claimed": true, 00:18:35.011 "claim_type": "exclusive_write", 00:18:35.011 "zoned": false, 00:18:35.011 "supported_io_types": { 00:18:35.011 "read": true, 00:18:35.011 "write": true, 00:18:35.011 "unmap": true, 00:18:35.011 "flush": true, 00:18:35.011 "reset": true, 00:18:35.011 "nvme_admin": false, 00:18:35.011 "nvme_io": false, 00:18:35.011 "nvme_io_md": false, 00:18:35.011 "write_zeroes": true, 00:18:35.011 "zcopy": true, 00:18:35.011 "get_zone_info": false, 00:18:35.011 "zone_management": false, 00:18:35.011 "zone_append": false, 00:18:35.011 "compare": false, 00:18:35.011 "compare_and_write": false, 00:18:35.011 "abort": true, 00:18:35.011 "seek_hole": false, 00:18:35.011 "seek_data": false, 00:18:35.011 "copy": true, 00:18:35.011 "nvme_iov_md": false 00:18:35.011 }, 00:18:35.011 "memory_domains": [ 00:18:35.011 { 00:18:35.011 "dma_device_id": "system", 00:18:35.011 "dma_device_type": 1 00:18:35.011 }, 00:18:35.011 { 00:18:35.011 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.011 "dma_device_type": 2 00:18:35.011 } 00:18:35.011 ], 00:18:35.011 "driver_specific": {} 00:18:35.011 }' 00:18:35.011 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:35.269 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.527 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.527 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:35.527 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:35.527 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:35.527 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:35.785 "name": "BaseBdev4", 00:18:35.785 "aliases": [ 00:18:35.785 "20b8548a-50e0-41eb-9385-dcfdb605327d" 00:18:35.785 ], 00:18:35.785 "product_name": "Malloc disk", 00:18:35.785 "block_size": 512, 00:18:35.785 "num_blocks": 65536, 00:18:35.785 "uuid": "20b8548a-50e0-41eb-9385-dcfdb605327d", 00:18:35.785 "assigned_rate_limits": { 00:18:35.785 "rw_ios_per_sec": 0, 00:18:35.785 "rw_mbytes_per_sec": 0, 00:18:35.785 "r_mbytes_per_sec": 0, 00:18:35.785 "w_mbytes_per_sec": 0 00:18:35.785 }, 00:18:35.785 "claimed": true, 00:18:35.785 "claim_type": "exclusive_write", 00:18:35.785 "zoned": false, 00:18:35.785 "supported_io_types": { 00:18:35.785 "read": true, 00:18:35.785 "write": true, 00:18:35.785 "unmap": true, 00:18:35.785 "flush": true, 00:18:35.785 "reset": true, 00:18:35.785 "nvme_admin": false, 00:18:35.785 "nvme_io": false, 00:18:35.785 "nvme_io_md": false, 00:18:35.785 "write_zeroes": true, 
00:18:35.785 "zcopy": true, 00:18:35.785 "get_zone_info": false, 00:18:35.785 "zone_management": false, 00:18:35.785 "zone_append": false, 00:18:35.785 "compare": false, 00:18:35.785 "compare_and_write": false, 00:18:35.785 "abort": true, 00:18:35.785 "seek_hole": false, 00:18:35.785 "seek_data": false, 00:18:35.785 "copy": true, 00:18:35.785 "nvme_iov_md": false 00:18:35.785 }, 00:18:35.785 "memory_domains": [ 00:18:35.785 { 00:18:35.785 "dma_device_id": "system", 00:18:35.785 "dma_device_type": 1 00:18:35.785 }, 00:18:35.785 { 00:18:35.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.785 "dma_device_type": 2 00:18:35.785 } 00:18:35.785 ], 00:18:35.785 "driver_specific": {} 00:18:35.785 }' 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:35.785 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.786 10:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.043 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:36.043 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.043 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.043 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:36.043 10:26:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:36.300 [2024-07-15 10:26:13.319545] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:36.300 [2024-07-15 10:26:13.319572] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:36.300 [2024-07-15 10:26:13.319625] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.300 10:26:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:36.300 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.557 10:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.557 "name": "Existed_Raid", 00:18:36.557 "uuid": "60b8839b-0219-49ff-8b56-ad1c38985e55", 00:18:36.557 "strip_size_kb": 64, 00:18:36.557 "state": "offline", 00:18:36.557 "raid_level": "raid0", 00:18:36.557 "superblock": false, 00:18:36.557 "num_base_bdevs": 4, 00:18:36.557 "num_base_bdevs_discovered": 3, 00:18:36.557 "num_base_bdevs_operational": 3, 00:18:36.557 "base_bdevs_list": [ 00:18:36.557 { 00:18:36.557 "name": null, 00:18:36.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:36.557 "is_configured": false, 00:18:36.557 "data_offset": 0, 00:18:36.557 "data_size": 65536 00:18:36.557 }, 00:18:36.557 { 00:18:36.557 "name": "BaseBdev2", 00:18:36.557 "uuid": "249ad702-9f3c-497d-b87a-4949d6f19bdd", 00:18:36.557 "is_configured": true, 00:18:36.557 "data_offset": 0, 00:18:36.557 "data_size": 65536 00:18:36.557 }, 00:18:36.557 { 00:18:36.557 "name": "BaseBdev3", 00:18:36.557 "uuid": "7ccb5767-fcc6-4b94-bb76-20c71d2a4bb5", 00:18:36.557 "is_configured": true, 00:18:36.557 "data_offset": 0, 00:18:36.557 "data_size": 65536 00:18:36.557 }, 00:18:36.557 { 00:18:36.557 "name": "BaseBdev4", 00:18:36.557 "uuid": "20b8548a-50e0-41eb-9385-dcfdb605327d", 00:18:36.557 "is_configured": true, 00:18:36.557 "data_offset": 0, 00:18:36.557 "data_size": 65536 00:18:36.557 } 00:18:36.557 ] 00:18:36.557 }' 00:18:36.557 10:26:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.557 10:26:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:37.123 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:37.123 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:37.123 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.123 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:37.412 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:37.412 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:37.412 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:37.674 [2024-07-15 10:26:14.668426] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:37.674 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:37.674 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:37.674 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.674 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:37.931 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:37.931 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid 
']' 00:18:37.931 10:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:38.189 [2024-07-15 10:26:15.166154] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:38.189 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:38.189 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:38.189 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:38.189 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.447 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:38.447 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:38.447 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:38.705 [2024-07-15 10:26:15.665916] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:38.705 [2024-07-15 10:26:15.665964] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa6c350 name Existed_Raid, state offline 00:18:38.705 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:38.705 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:38.705 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:38.705 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.963 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:38.963 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:38.963 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:38.963 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:38.963 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:38.963 10:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:39.221 BaseBdev2 00:18:39.221 10:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:39.221 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:39.221 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:39.221 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:39.221 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:39.221 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:39.221 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:39.479 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:39.479 [ 00:18:39.479 { 
00:18:39.479 "name": "BaseBdev2", 00:18:39.479 "aliases": [ 00:18:39.479 "0164acc0-0e24-4f27-8fdb-ba265e17b431" 00:18:39.479 ], 00:18:39.479 "product_name": "Malloc disk", 00:18:39.479 "block_size": 512, 00:18:39.479 "num_blocks": 65536, 00:18:39.479 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:39.479 "assigned_rate_limits": { 00:18:39.479 "rw_ios_per_sec": 0, 00:18:39.479 "rw_mbytes_per_sec": 0, 00:18:39.479 "r_mbytes_per_sec": 0, 00:18:39.479 "w_mbytes_per_sec": 0 00:18:39.479 }, 00:18:39.479 "claimed": false, 00:18:39.479 "zoned": false, 00:18:39.479 "supported_io_types": { 00:18:39.479 "read": true, 00:18:39.479 "write": true, 00:18:39.479 "unmap": true, 00:18:39.479 "flush": true, 00:18:39.479 "reset": true, 00:18:39.479 "nvme_admin": false, 00:18:39.479 "nvme_io": false, 00:18:39.479 "nvme_io_md": false, 00:18:39.479 "write_zeroes": true, 00:18:39.479 "zcopy": true, 00:18:39.479 "get_zone_info": false, 00:18:39.479 "zone_management": false, 00:18:39.479 "zone_append": false, 00:18:39.479 "compare": false, 00:18:39.479 "compare_and_write": false, 00:18:39.479 "abort": true, 00:18:39.479 "seek_hole": false, 00:18:39.479 "seek_data": false, 00:18:39.479 "copy": true, 00:18:39.479 "nvme_iov_md": false 00:18:39.479 }, 00:18:39.479 "memory_domains": [ 00:18:39.479 { 00:18:39.479 "dma_device_id": "system", 00:18:39.479 "dma_device_type": 1 00:18:39.479 }, 00:18:39.479 { 00:18:39.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.480 "dma_device_type": 2 00:18:39.480 } 00:18:39.480 ], 00:18:39.480 "driver_specific": {} 00:18:39.480 } 00:18:39.480 ] 00:18:39.480 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:39.480 10:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:39.480 10:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:39.480 10:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:39.737 BaseBdev3 00:18:39.737 10:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:39.737 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:39.737 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:39.737 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:39.737 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:39.737 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:39.737 10:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:39.995 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:40.253 [ 00:18:40.253 { 00:18:40.253 "name": "BaseBdev3", 00:18:40.253 "aliases": [ 00:18:40.253 "074241e3-a670-4052-9fce-ec28e3262e50" 00:18:40.253 ], 00:18:40.253 "product_name": "Malloc disk", 00:18:40.253 "block_size": 512, 00:18:40.253 "num_blocks": 65536, 00:18:40.253 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:40.253 "assigned_rate_limits": { 00:18:40.253 "rw_ios_per_sec": 0, 00:18:40.253 "rw_mbytes_per_sec": 0, 00:18:40.253 "r_mbytes_per_sec": 0, 00:18:40.253 "w_mbytes_per_sec": 0 00:18:40.253 }, 00:18:40.253 "claimed": false, 00:18:40.253 "zoned": false, 00:18:40.253 "supported_io_types": { 00:18:40.253 "read": true, 00:18:40.253 "write": true, 00:18:40.253 "unmap": true, 00:18:40.253 "flush": true, 00:18:40.253 
"reset": true, 00:18:40.253 "nvme_admin": false, 00:18:40.253 "nvme_io": false, 00:18:40.253 "nvme_io_md": false, 00:18:40.253 "write_zeroes": true, 00:18:40.253 "zcopy": true, 00:18:40.253 "get_zone_info": false, 00:18:40.253 "zone_management": false, 00:18:40.253 "zone_append": false, 00:18:40.253 "compare": false, 00:18:40.253 "compare_and_write": false, 00:18:40.253 "abort": true, 00:18:40.253 "seek_hole": false, 00:18:40.253 "seek_data": false, 00:18:40.253 "copy": true, 00:18:40.253 "nvme_iov_md": false 00:18:40.253 }, 00:18:40.253 "memory_domains": [ 00:18:40.253 { 00:18:40.253 "dma_device_id": "system", 00:18:40.253 "dma_device_type": 1 00:18:40.253 }, 00:18:40.253 { 00:18:40.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.253 "dma_device_type": 2 00:18:40.253 } 00:18:40.253 ], 00:18:40.253 "driver_specific": {} 00:18:40.253 } 00:18:40.253 ] 00:18:40.253 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:40.253 10:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:40.253 10:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:40.253 10:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:40.511 BaseBdev4 00:18:40.511 10:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:40.511 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:40.511 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:40.511 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:40.511 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:40.511 10:26:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:40.511 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:40.769 10:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:41.027 [ 00:18:41.027 { 00:18:41.027 "name": "BaseBdev4", 00:18:41.027 "aliases": [ 00:18:41.027 "224cee66-ca02-4880-9a71-e3c61437c695" 00:18:41.027 ], 00:18:41.027 "product_name": "Malloc disk", 00:18:41.027 "block_size": 512, 00:18:41.027 "num_blocks": 65536, 00:18:41.027 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:41.027 "assigned_rate_limits": { 00:18:41.027 "rw_ios_per_sec": 0, 00:18:41.027 "rw_mbytes_per_sec": 0, 00:18:41.027 "r_mbytes_per_sec": 0, 00:18:41.027 "w_mbytes_per_sec": 0 00:18:41.027 }, 00:18:41.027 "claimed": false, 00:18:41.027 "zoned": false, 00:18:41.027 "supported_io_types": { 00:18:41.027 "read": true, 00:18:41.027 "write": true, 00:18:41.027 "unmap": true, 00:18:41.027 "flush": true, 00:18:41.027 "reset": true, 00:18:41.027 "nvme_admin": false, 00:18:41.027 "nvme_io": false, 00:18:41.027 "nvme_io_md": false, 00:18:41.027 "write_zeroes": true, 00:18:41.027 "zcopy": true, 00:18:41.027 "get_zone_info": false, 00:18:41.027 "zone_management": false, 00:18:41.027 "zone_append": false, 00:18:41.027 "compare": false, 00:18:41.027 "compare_and_write": false, 00:18:41.027 "abort": true, 00:18:41.027 "seek_hole": false, 00:18:41.027 "seek_data": false, 00:18:41.027 "copy": true, 00:18:41.027 "nvme_iov_md": false 00:18:41.027 }, 00:18:41.027 "memory_domains": [ 00:18:41.027 { 00:18:41.027 "dma_device_id": "system", 00:18:41.027 "dma_device_type": 1 00:18:41.027 }, 00:18:41.027 { 00:18:41.027 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:41.027 "dma_device_type": 2 00:18:41.027 } 00:18:41.027 ], 00:18:41.027 "driver_specific": {} 00:18:41.027 } 00:18:41.027 ] 00:18:41.028 10:26:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:41.028 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:41.028 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:41.028 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:41.286 [2024-07-15 10:26:18.353688] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:41.286 [2024-07-15 10:26:18.353727] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:41.286 [2024-07-15 10:26:18.353747] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:41.286 [2024-07-15 10:26:18.355122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:41.286 [2024-07-15 10:26:18.355163] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.286 
10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.286 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.544 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.544 "name": "Existed_Raid", 00:18:41.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.544 "strip_size_kb": 64, 00:18:41.544 "state": "configuring", 00:18:41.544 "raid_level": "raid0", 00:18:41.544 "superblock": false, 00:18:41.544 "num_base_bdevs": 4, 00:18:41.544 "num_base_bdevs_discovered": 3, 00:18:41.544 "num_base_bdevs_operational": 4, 00:18:41.544 "base_bdevs_list": [ 00:18:41.544 { 00:18:41.544 "name": "BaseBdev1", 00:18:41.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.544 "is_configured": false, 00:18:41.544 "data_offset": 0, 00:18:41.544 "data_size": 0 00:18:41.544 }, 00:18:41.544 { 00:18:41.544 "name": "BaseBdev2", 00:18:41.544 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:41.544 "is_configured": true, 00:18:41.544 "data_offset": 0, 00:18:41.544 "data_size": 65536 00:18:41.544 }, 00:18:41.544 { 00:18:41.544 "name": "BaseBdev3", 00:18:41.544 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:41.544 "is_configured": true, 00:18:41.544 "data_offset": 
0, 00:18:41.544 "data_size": 65536 00:18:41.544 }, 00:18:41.544 { 00:18:41.544 "name": "BaseBdev4", 00:18:41.544 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:41.544 "is_configured": true, 00:18:41.544 "data_offset": 0, 00:18:41.544 "data_size": 65536 00:18:41.544 } 00:18:41.544 ] 00:18:41.544 }' 00:18:41.544 10:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.544 10:26:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:42.110 [2024-07-15 10:26:19.264141] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.110 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.368 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.368 "name": "Existed_Raid", 00:18:42.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.368 "strip_size_kb": 64, 00:18:42.368 "state": "configuring", 00:18:42.368 "raid_level": "raid0", 00:18:42.368 "superblock": false, 00:18:42.368 "num_base_bdevs": 4, 00:18:42.368 "num_base_bdevs_discovered": 2, 00:18:42.368 "num_base_bdevs_operational": 4, 00:18:42.368 "base_bdevs_list": [ 00:18:42.368 { 00:18:42.368 "name": "BaseBdev1", 00:18:42.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.368 "is_configured": false, 00:18:42.368 "data_offset": 0, 00:18:42.368 "data_size": 0 00:18:42.368 }, 00:18:42.368 { 00:18:42.368 "name": null, 00:18:42.368 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:42.368 "is_configured": false, 00:18:42.368 "data_offset": 0, 00:18:42.368 "data_size": 65536 00:18:42.368 }, 00:18:42.368 { 00:18:42.368 "name": "BaseBdev3", 00:18:42.368 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:42.368 "is_configured": true, 00:18:42.368 "data_offset": 0, 00:18:42.368 "data_size": 65536 00:18:42.368 }, 00:18:42.368 { 00:18:42.368 "name": "BaseBdev4", 00:18:42.368 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:42.368 "is_configured": true, 00:18:42.368 "data_offset": 0, 00:18:42.368 "data_size": 65536 00:18:42.368 } 00:18:42.368 ] 00:18:42.368 }' 00:18:42.368 10:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.368 10:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:42.935 10:26:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.935 10:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:43.193 10:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:43.193 10:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:43.761 [2024-07-15 10:26:20.787516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:43.761 BaseBdev1 00:18:43.761 10:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:43.761 10:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:43.761 10:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:43.761 10:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:43.761 10:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:43.761 10:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:43.761 10:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:44.020 10:26:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:44.586 [ 00:18:44.586 { 00:18:44.586 "name": "BaseBdev1", 00:18:44.586 "aliases": [ 00:18:44.586 
"f792e8cd-aa61-411c-88d5-3c9f4bb47a31" 00:18:44.586 ], 00:18:44.586 "product_name": "Malloc disk", 00:18:44.586 "block_size": 512, 00:18:44.586 "num_blocks": 65536, 00:18:44.586 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:44.586 "assigned_rate_limits": { 00:18:44.586 "rw_ios_per_sec": 0, 00:18:44.586 "rw_mbytes_per_sec": 0, 00:18:44.586 "r_mbytes_per_sec": 0, 00:18:44.586 "w_mbytes_per_sec": 0 00:18:44.586 }, 00:18:44.586 "claimed": true, 00:18:44.586 "claim_type": "exclusive_write", 00:18:44.586 "zoned": false, 00:18:44.586 "supported_io_types": { 00:18:44.586 "read": true, 00:18:44.586 "write": true, 00:18:44.586 "unmap": true, 00:18:44.586 "flush": true, 00:18:44.586 "reset": true, 00:18:44.586 "nvme_admin": false, 00:18:44.586 "nvme_io": false, 00:18:44.586 "nvme_io_md": false, 00:18:44.586 "write_zeroes": true, 00:18:44.586 "zcopy": true, 00:18:44.586 "get_zone_info": false, 00:18:44.586 "zone_management": false, 00:18:44.586 "zone_append": false, 00:18:44.586 "compare": false, 00:18:44.586 "compare_and_write": false, 00:18:44.586 "abort": true, 00:18:44.586 "seek_hole": false, 00:18:44.586 "seek_data": false, 00:18:44.586 "copy": true, 00:18:44.586 "nvme_iov_md": false 00:18:44.586 }, 00:18:44.586 "memory_domains": [ 00:18:44.586 { 00:18:44.586 "dma_device_id": "system", 00:18:44.586 "dma_device_type": 1 00:18:44.586 }, 00:18:44.586 { 00:18:44.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.587 "dma_device_type": 2 00:18:44.587 } 00:18:44.587 ], 00:18:44.587 "driver_specific": {} 00:18:44.587 } 00:18:44.587 ] 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.587 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:44.846 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.846 "name": "Existed_Raid", 00:18:44.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:44.846 "strip_size_kb": 64, 00:18:44.846 "state": "configuring", 00:18:44.846 "raid_level": "raid0", 00:18:44.846 "superblock": false, 00:18:44.846 "num_base_bdevs": 4, 00:18:44.846 "num_base_bdevs_discovered": 3, 00:18:44.846 "num_base_bdevs_operational": 4, 00:18:44.846 "base_bdevs_list": [ 00:18:44.846 { 00:18:44.846 "name": "BaseBdev1", 00:18:44.846 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:44.846 "is_configured": true, 00:18:44.846 "data_offset": 0, 00:18:44.846 "data_size": 65536 00:18:44.846 }, 00:18:44.846 { 00:18:44.846 "name": null, 00:18:44.846 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 
00:18:44.846 "is_configured": false, 00:18:44.846 "data_offset": 0, 00:18:44.846 "data_size": 65536 00:18:44.846 }, 00:18:44.846 { 00:18:44.846 "name": "BaseBdev3", 00:18:44.846 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:44.846 "is_configured": true, 00:18:44.846 "data_offset": 0, 00:18:44.846 "data_size": 65536 00:18:44.846 }, 00:18:44.846 { 00:18:44.846 "name": "BaseBdev4", 00:18:44.846 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:44.846 "is_configured": true, 00:18:44.846 "data_offset": 0, 00:18:44.846 "data_size": 65536 00:18:44.846 } 00:18:44.846 ] 00:18:44.846 }' 00:18:44.846 10:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.846 10:26:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.413 10:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.413 10:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:45.672 10:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:45.672 10:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:46.246 [2024-07-15 10:26:23.129790] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.246 10:26:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:46.246 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.246 "name": "Existed_Raid", 00:18:46.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:46.246 "strip_size_kb": 64, 00:18:46.246 "state": "configuring", 00:18:46.246 "raid_level": "raid0", 00:18:46.246 "superblock": false, 00:18:46.246 "num_base_bdevs": 4, 00:18:46.246 "num_base_bdevs_discovered": 2, 00:18:46.246 "num_base_bdevs_operational": 4, 00:18:46.246 "base_bdevs_list": [ 00:18:46.246 { 00:18:46.246 "name": "BaseBdev1", 00:18:46.246 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:46.246 "is_configured": true, 00:18:46.246 "data_offset": 0, 00:18:46.246 "data_size": 65536 00:18:46.246 }, 00:18:46.246 { 00:18:46.246 "name": null, 00:18:46.246 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:46.246 "is_configured": false, 00:18:46.246 "data_offset": 0, 00:18:46.246 
"data_size": 65536 00:18:46.246 }, 00:18:46.246 { 00:18:46.246 "name": null, 00:18:46.247 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:46.247 "is_configured": false, 00:18:46.247 "data_offset": 0, 00:18:46.247 "data_size": 65536 00:18:46.247 }, 00:18:46.247 { 00:18:46.247 "name": "BaseBdev4", 00:18:46.247 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:46.247 "is_configured": true, 00:18:46.247 "data_offset": 0, 00:18:46.247 "data_size": 65536 00:18:46.247 } 00:18:46.247 ] 00:18:46.247 }' 00:18:46.247 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.247 10:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.818 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.818 10:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:47.078 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:47.078 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:47.336 [2024-07-15 10:26:24.457315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:47.336 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.337 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.337 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.595 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.595 "name": "Existed_Raid", 00:18:47.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:47.595 "strip_size_kb": 64, 00:18:47.595 "state": "configuring", 00:18:47.595 "raid_level": "raid0", 00:18:47.595 "superblock": false, 00:18:47.595 "num_base_bdevs": 4, 00:18:47.595 "num_base_bdevs_discovered": 3, 00:18:47.595 "num_base_bdevs_operational": 4, 00:18:47.595 "base_bdevs_list": [ 00:18:47.595 { 00:18:47.595 "name": "BaseBdev1", 00:18:47.595 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:47.595 "is_configured": true, 00:18:47.595 "data_offset": 0, 00:18:47.595 "data_size": 65536 00:18:47.595 }, 00:18:47.595 { 00:18:47.595 "name": null, 00:18:47.595 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:47.595 "is_configured": false, 00:18:47.595 "data_offset": 0, 00:18:47.595 "data_size": 65536 00:18:47.595 }, 00:18:47.595 { 00:18:47.595 "name": 
"BaseBdev3", 00:18:47.595 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:47.595 "is_configured": true, 00:18:47.595 "data_offset": 0, 00:18:47.595 "data_size": 65536 00:18:47.595 }, 00:18:47.595 { 00:18:47.595 "name": "BaseBdev4", 00:18:47.595 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:47.595 "is_configured": true, 00:18:47.595 "data_offset": 0, 00:18:47.595 "data_size": 65536 00:18:47.595 } 00:18:47.595 ] 00:18:47.595 }' 00:18:47.595 10:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.595 10:26:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.161 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.161 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:48.420 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:48.420 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:48.679 [2024-07-15 10:26:25.792859] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.679 10:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.938 10:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.938 "name": "Existed_Raid", 00:18:48.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.938 "strip_size_kb": 64, 00:18:48.938 "state": "configuring", 00:18:48.938 "raid_level": "raid0", 00:18:48.938 "superblock": false, 00:18:48.938 "num_base_bdevs": 4, 00:18:48.938 "num_base_bdevs_discovered": 2, 00:18:48.938 "num_base_bdevs_operational": 4, 00:18:48.938 "base_bdevs_list": [ 00:18:48.938 { 00:18:48.938 "name": null, 00:18:48.938 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:48.938 "is_configured": false, 00:18:48.938 "data_offset": 0, 00:18:48.938 "data_size": 65536 00:18:48.938 }, 00:18:48.938 { 00:18:48.938 "name": null, 00:18:48.938 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:48.938 "is_configured": false, 00:18:48.938 "data_offset": 0, 00:18:48.938 "data_size": 65536 00:18:48.938 }, 00:18:48.938 { 00:18:48.938 "name": "BaseBdev3", 00:18:48.938 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:48.938 "is_configured": true, 
00:18:48.938 "data_offset": 0, 00:18:48.938 "data_size": 65536 00:18:48.938 }, 00:18:48.938 { 00:18:48.938 "name": "BaseBdev4", 00:18:48.938 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:48.938 "is_configured": true, 00:18:48.938 "data_offset": 0, 00:18:48.938 "data_size": 65536 00:18:48.938 } 00:18:48.938 ] 00:18:48.938 }' 00:18:48.938 10:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.938 10:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:49.505 10:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.505 10:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:49.763 10:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:49.763 10:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:50.023 [2024-07-15 10:26:27.074908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.023 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.282 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.282 "name": "Existed_Raid", 00:18:50.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.282 "strip_size_kb": 64, 00:18:50.282 "state": "configuring", 00:18:50.282 "raid_level": "raid0", 00:18:50.282 "superblock": false, 00:18:50.282 "num_base_bdevs": 4, 00:18:50.282 "num_base_bdevs_discovered": 3, 00:18:50.282 "num_base_bdevs_operational": 4, 00:18:50.282 "base_bdevs_list": [ 00:18:50.282 { 00:18:50.282 "name": null, 00:18:50.282 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:50.282 "is_configured": false, 00:18:50.282 "data_offset": 0, 00:18:50.282 "data_size": 65536 00:18:50.282 }, 00:18:50.282 { 00:18:50.282 "name": "BaseBdev2", 00:18:50.282 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:50.282 "is_configured": true, 00:18:50.282 "data_offset": 0, 00:18:50.282 "data_size": 65536 00:18:50.282 }, 00:18:50.282 { 00:18:50.282 "name": "BaseBdev3", 00:18:50.282 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:50.282 "is_configured": true, 00:18:50.282 "data_offset": 0, 00:18:50.282 "data_size": 65536 00:18:50.282 }, 
00:18:50.282 { 00:18:50.282 "name": "BaseBdev4", 00:18:50.282 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:50.282 "is_configured": true, 00:18:50.282 "data_offset": 0, 00:18:50.282 "data_size": 65536 00:18:50.282 } 00:18:50.282 ] 00:18:50.282 }' 00:18:50.282 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.282 10:26:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:50.850 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.850 10:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:51.109 10:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:51.109 10:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.109 10:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:51.368 10:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f792e8cd-aa61-411c-88d5-3c9f4bb47a31 00:18:51.626 [2024-07-15 10:26:28.695683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:51.627 [2024-07-15 10:26:28.695721] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa70040 00:18:51.627 [2024-07-15 10:26:28.695729] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:51.627 [2024-07-15 10:26:28.695937] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa6ba70 00:18:51.627 [2024-07-15 
10:26:28.696058] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa70040 00:18:51.627 [2024-07-15 10:26:28.696068] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa70040 00:18:51.627 [2024-07-15 10:26:28.696229] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:51.627 NewBaseBdev 00:18:51.627 10:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:51.627 10:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:51.627 10:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:51.627 10:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:51.627 10:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:51.627 10:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:51.627 10:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:51.884 10:26:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:52.142 [ 00:18:52.142 { 00:18:52.142 "name": "NewBaseBdev", 00:18:52.142 "aliases": [ 00:18:52.142 "f792e8cd-aa61-411c-88d5-3c9f4bb47a31" 00:18:52.142 ], 00:18:52.142 "product_name": "Malloc disk", 00:18:52.142 "block_size": 512, 00:18:52.142 "num_blocks": 65536, 00:18:52.142 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:52.142 "assigned_rate_limits": { 00:18:52.142 "rw_ios_per_sec": 0, 00:18:52.142 "rw_mbytes_per_sec": 0, 00:18:52.142 "r_mbytes_per_sec": 0, 00:18:52.142 
"w_mbytes_per_sec": 0 00:18:52.142 }, 00:18:52.142 "claimed": true, 00:18:52.142 "claim_type": "exclusive_write", 00:18:52.142 "zoned": false, 00:18:52.142 "supported_io_types": { 00:18:52.142 "read": true, 00:18:52.142 "write": true, 00:18:52.142 "unmap": true, 00:18:52.142 "flush": true, 00:18:52.142 "reset": true, 00:18:52.142 "nvme_admin": false, 00:18:52.142 "nvme_io": false, 00:18:52.142 "nvme_io_md": false, 00:18:52.142 "write_zeroes": true, 00:18:52.142 "zcopy": true, 00:18:52.142 "get_zone_info": false, 00:18:52.142 "zone_management": false, 00:18:52.142 "zone_append": false, 00:18:52.142 "compare": false, 00:18:52.142 "compare_and_write": false, 00:18:52.142 "abort": true, 00:18:52.142 "seek_hole": false, 00:18:52.142 "seek_data": false, 00:18:52.142 "copy": true, 00:18:52.142 "nvme_iov_md": false 00:18:52.142 }, 00:18:52.142 "memory_domains": [ 00:18:52.142 { 00:18:52.142 "dma_device_id": "system", 00:18:52.142 "dma_device_type": 1 00:18:52.142 }, 00:18:52.142 { 00:18:52.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.142 "dma_device_type": 2 00:18:52.142 } 00:18:52.142 ], 00:18:52.142 "driver_specific": {} 00:18:52.142 } 00:18:52.142 ] 00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:18:52.142 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.143 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.143 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.143 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.143 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.143 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:52.400 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.400 "name": "Existed_Raid", 00:18:52.400 "uuid": "d9964373-4499-4130-9036-2807097ebd95", 00:18:52.400 "strip_size_kb": 64, 00:18:52.400 "state": "online", 00:18:52.400 "raid_level": "raid0", 00:18:52.400 "superblock": false, 00:18:52.400 "num_base_bdevs": 4, 00:18:52.400 "num_base_bdevs_discovered": 4, 00:18:52.400 "num_base_bdevs_operational": 4, 00:18:52.400 "base_bdevs_list": [ 00:18:52.400 { 00:18:52.400 "name": "NewBaseBdev", 00:18:52.400 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:52.400 "is_configured": true, 00:18:52.400 "data_offset": 0, 00:18:52.400 "data_size": 65536 00:18:52.400 }, 00:18:52.400 { 00:18:52.400 "name": "BaseBdev2", 00:18:52.400 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:52.400 "is_configured": true, 00:18:52.400 "data_offset": 0, 00:18:52.400 "data_size": 65536 00:18:52.400 }, 00:18:52.400 { 00:18:52.400 "name": "BaseBdev3", 00:18:52.400 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:52.400 "is_configured": true, 00:18:52.400 "data_offset": 0, 00:18:52.400 "data_size": 65536 00:18:52.400 }, 00:18:52.400 { 00:18:52.400 "name": "BaseBdev4", 
00:18:52.400 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:52.400 "is_configured": true, 00:18:52.400 "data_offset": 0, 00:18:52.400 "data_size": 65536 00:18:52.400 } 00:18:52.400 ] 00:18:52.400 }' 00:18:52.400 10:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.400 10:26:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:52.967 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:53.226 [2024-07-15 10:26:30.272280] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:53.226 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:53.226 "name": "Existed_Raid", 00:18:53.226 "aliases": [ 00:18:53.226 "d9964373-4499-4130-9036-2807097ebd95" 00:18:53.226 ], 00:18:53.226 "product_name": "Raid Volume", 00:18:53.226 "block_size": 512, 00:18:53.226 "num_blocks": 262144, 00:18:53.226 "uuid": "d9964373-4499-4130-9036-2807097ebd95", 00:18:53.226 "assigned_rate_limits": { 00:18:53.226 "rw_ios_per_sec": 0, 00:18:53.226 
"rw_mbytes_per_sec": 0, 00:18:53.226 "r_mbytes_per_sec": 0, 00:18:53.226 "w_mbytes_per_sec": 0 00:18:53.226 }, 00:18:53.226 "claimed": false, 00:18:53.226 "zoned": false, 00:18:53.226 "supported_io_types": { 00:18:53.226 "read": true, 00:18:53.226 "write": true, 00:18:53.226 "unmap": true, 00:18:53.226 "flush": true, 00:18:53.226 "reset": true, 00:18:53.226 "nvme_admin": false, 00:18:53.226 "nvme_io": false, 00:18:53.226 "nvme_io_md": false, 00:18:53.226 "write_zeroes": true, 00:18:53.226 "zcopy": false, 00:18:53.226 "get_zone_info": false, 00:18:53.226 "zone_management": false, 00:18:53.226 "zone_append": false, 00:18:53.226 "compare": false, 00:18:53.226 "compare_and_write": false, 00:18:53.226 "abort": false, 00:18:53.226 "seek_hole": false, 00:18:53.226 "seek_data": false, 00:18:53.226 "copy": false, 00:18:53.226 "nvme_iov_md": false 00:18:53.226 }, 00:18:53.226 "memory_domains": [ 00:18:53.226 { 00:18:53.226 "dma_device_id": "system", 00:18:53.226 "dma_device_type": 1 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.226 "dma_device_type": 2 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "dma_device_id": "system", 00:18:53.226 "dma_device_type": 1 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.226 "dma_device_type": 2 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "dma_device_id": "system", 00:18:53.226 "dma_device_type": 1 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.226 "dma_device_type": 2 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "dma_device_id": "system", 00:18:53.226 "dma_device_type": 1 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.226 "dma_device_type": 2 00:18:53.226 } 00:18:53.226 ], 00:18:53.226 "driver_specific": { 00:18:53.226 "raid": { 00:18:53.226 "uuid": "d9964373-4499-4130-9036-2807097ebd95", 00:18:53.226 "strip_size_kb": 64, 00:18:53.226 "state": "online", 
00:18:53.226 "raid_level": "raid0", 00:18:53.226 "superblock": false, 00:18:53.226 "num_base_bdevs": 4, 00:18:53.226 "num_base_bdevs_discovered": 4, 00:18:53.226 "num_base_bdevs_operational": 4, 00:18:53.226 "base_bdevs_list": [ 00:18:53.226 { 00:18:53.226 "name": "NewBaseBdev", 00:18:53.226 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:53.226 "is_configured": true, 00:18:53.226 "data_offset": 0, 00:18:53.226 "data_size": 65536 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "name": "BaseBdev2", 00:18:53.226 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:53.226 "is_configured": true, 00:18:53.226 "data_offset": 0, 00:18:53.226 "data_size": 65536 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "name": "BaseBdev3", 00:18:53.226 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:53.226 "is_configured": true, 00:18:53.226 "data_offset": 0, 00:18:53.226 "data_size": 65536 00:18:53.226 }, 00:18:53.226 { 00:18:53.226 "name": "BaseBdev4", 00:18:53.226 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:53.226 "is_configured": true, 00:18:53.226 "data_offset": 0, 00:18:53.226 "data_size": 65536 00:18:53.226 } 00:18:53.226 ] 00:18:53.226 } 00:18:53.226 } 00:18:53.226 }' 00:18:53.226 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:53.226 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:53.226 BaseBdev2 00:18:53.226 BaseBdev3 00:18:53.226 BaseBdev4' 00:18:53.226 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.226 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:53.226 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.512 10:26:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.512 "name": "NewBaseBdev", 00:18:53.512 "aliases": [ 00:18:53.512 "f792e8cd-aa61-411c-88d5-3c9f4bb47a31" 00:18:53.512 ], 00:18:53.512 "product_name": "Malloc disk", 00:18:53.512 "block_size": 512, 00:18:53.512 "num_blocks": 65536, 00:18:53.512 "uuid": "f792e8cd-aa61-411c-88d5-3c9f4bb47a31", 00:18:53.512 "assigned_rate_limits": { 00:18:53.512 "rw_ios_per_sec": 0, 00:18:53.512 "rw_mbytes_per_sec": 0, 00:18:53.512 "r_mbytes_per_sec": 0, 00:18:53.512 "w_mbytes_per_sec": 0 00:18:53.512 }, 00:18:53.512 "claimed": true, 00:18:53.512 "claim_type": "exclusive_write", 00:18:53.512 "zoned": false, 00:18:53.512 "supported_io_types": { 00:18:53.512 "read": true, 00:18:53.512 "write": true, 00:18:53.512 "unmap": true, 00:18:53.512 "flush": true, 00:18:53.512 "reset": true, 00:18:53.512 "nvme_admin": false, 00:18:53.512 "nvme_io": false, 00:18:53.512 "nvme_io_md": false, 00:18:53.512 "write_zeroes": true, 00:18:53.512 "zcopy": true, 00:18:53.512 "get_zone_info": false, 00:18:53.512 "zone_management": false, 00:18:53.512 "zone_append": false, 00:18:53.512 "compare": false, 00:18:53.512 "compare_and_write": false, 00:18:53.512 "abort": true, 00:18:53.512 "seek_hole": false, 00:18:53.512 "seek_data": false, 00:18:53.512 "copy": true, 00:18:53.512 "nvme_iov_md": false 00:18:53.512 }, 00:18:53.512 "memory_domains": [ 00:18:53.512 { 00:18:53.512 "dma_device_id": "system", 00:18:53.512 "dma_device_type": 1 00:18:53.512 }, 00:18:53.512 { 00:18:53.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.512 "dma_device_type": 2 00:18:53.512 } 00:18:53.512 ], 00:18:53.512 "driver_specific": {} 00:18:53.512 }' 00:18:53.512 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.512 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.512 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:18:53.512 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.771 10:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:54.070 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.070 "name": "BaseBdev2", 00:18:54.070 "aliases": [ 00:18:54.070 "0164acc0-0e24-4f27-8fdb-ba265e17b431" 00:18:54.070 ], 00:18:54.070 "product_name": "Malloc disk", 00:18:54.070 "block_size": 512, 00:18:54.070 "num_blocks": 65536, 00:18:54.070 "uuid": "0164acc0-0e24-4f27-8fdb-ba265e17b431", 00:18:54.070 "assigned_rate_limits": { 00:18:54.070 "rw_ios_per_sec": 0, 00:18:54.070 "rw_mbytes_per_sec": 0, 00:18:54.070 "r_mbytes_per_sec": 0, 00:18:54.070 "w_mbytes_per_sec": 0 00:18:54.070 }, 00:18:54.070 "claimed": true, 00:18:54.070 
"claim_type": "exclusive_write", 00:18:54.070 "zoned": false, 00:18:54.070 "supported_io_types": { 00:18:54.070 "read": true, 00:18:54.070 "write": true, 00:18:54.070 "unmap": true, 00:18:54.070 "flush": true, 00:18:54.070 "reset": true, 00:18:54.070 "nvme_admin": false, 00:18:54.070 "nvme_io": false, 00:18:54.070 "nvme_io_md": false, 00:18:54.070 "write_zeroes": true, 00:18:54.070 "zcopy": true, 00:18:54.070 "get_zone_info": false, 00:18:54.070 "zone_management": false, 00:18:54.070 "zone_append": false, 00:18:54.070 "compare": false, 00:18:54.070 "compare_and_write": false, 00:18:54.070 "abort": true, 00:18:54.070 "seek_hole": false, 00:18:54.070 "seek_data": false, 00:18:54.070 "copy": true, 00:18:54.070 "nvme_iov_md": false 00:18:54.070 }, 00:18:54.070 "memory_domains": [ 00:18:54.070 { 00:18:54.070 "dma_device_id": "system", 00:18:54.070 "dma_device_type": 1 00:18:54.070 }, 00:18:54.070 { 00:18:54.070 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.070 "dma_device_type": 2 00:18:54.070 } 00:18:54.070 ], 00:18:54.070 "driver_specific": {} 00:18:54.070 }' 00:18:54.070 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.070 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:54.353 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.612 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.612 "name": "BaseBdev3", 00:18:54.612 "aliases": [ 00:18:54.612 "074241e3-a670-4052-9fce-ec28e3262e50" 00:18:54.612 ], 00:18:54.612 "product_name": "Malloc disk", 00:18:54.612 "block_size": 512, 00:18:54.612 "num_blocks": 65536, 00:18:54.612 "uuid": "074241e3-a670-4052-9fce-ec28e3262e50", 00:18:54.612 "assigned_rate_limits": { 00:18:54.612 "rw_ios_per_sec": 0, 00:18:54.612 "rw_mbytes_per_sec": 0, 00:18:54.612 "r_mbytes_per_sec": 0, 00:18:54.612 "w_mbytes_per_sec": 0 00:18:54.612 }, 00:18:54.612 "claimed": true, 00:18:54.612 "claim_type": "exclusive_write", 00:18:54.612 "zoned": false, 00:18:54.612 "supported_io_types": { 00:18:54.612 "read": true, 00:18:54.612 "write": true, 00:18:54.612 "unmap": true, 00:18:54.612 "flush": true, 00:18:54.612 "reset": true, 00:18:54.612 "nvme_admin": false, 00:18:54.612 "nvme_io": false, 00:18:54.612 "nvme_io_md": false, 00:18:54.612 "write_zeroes": true, 00:18:54.612 "zcopy": true, 00:18:54.612 "get_zone_info": false, 00:18:54.612 "zone_management": false, 00:18:54.612 "zone_append": false, 00:18:54.612 "compare": false, 00:18:54.612 "compare_and_write": false, 00:18:54.612 "abort": true, 00:18:54.612 
"seek_hole": false, 00:18:54.612 "seek_data": false, 00:18:54.612 "copy": true, 00:18:54.612 "nvme_iov_md": false 00:18:54.612 }, 00:18:54.612 "memory_domains": [ 00:18:54.612 { 00:18:54.612 "dma_device_id": "system", 00:18:54.612 "dma_device_type": 1 00:18:54.612 }, 00:18:54.612 { 00:18:54.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.612 "dma_device_type": 2 00:18:54.612 } 00:18:54.612 ], 00:18:54.612 "driver_specific": {} 00:18:54.612 }' 00:18:54.612 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.612 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.870 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.870 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.870 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.870 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.870 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.870 10:26:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.870 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.870 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.129 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.129 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.129 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.129 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:18:55.129 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.387 "name": "BaseBdev4", 00:18:55.387 "aliases": [ 00:18:55.387 "224cee66-ca02-4880-9a71-e3c61437c695" 00:18:55.387 ], 00:18:55.387 "product_name": "Malloc disk", 00:18:55.387 "block_size": 512, 00:18:55.387 "num_blocks": 65536, 00:18:55.387 "uuid": "224cee66-ca02-4880-9a71-e3c61437c695", 00:18:55.387 "assigned_rate_limits": { 00:18:55.387 "rw_ios_per_sec": 0, 00:18:55.387 "rw_mbytes_per_sec": 0, 00:18:55.387 "r_mbytes_per_sec": 0, 00:18:55.387 "w_mbytes_per_sec": 0 00:18:55.387 }, 00:18:55.387 "claimed": true, 00:18:55.387 "claim_type": "exclusive_write", 00:18:55.387 "zoned": false, 00:18:55.387 "supported_io_types": { 00:18:55.387 "read": true, 00:18:55.387 "write": true, 00:18:55.387 "unmap": true, 00:18:55.387 "flush": true, 00:18:55.387 "reset": true, 00:18:55.387 "nvme_admin": false, 00:18:55.387 "nvme_io": false, 00:18:55.387 "nvme_io_md": false, 00:18:55.387 "write_zeroes": true, 00:18:55.387 "zcopy": true, 00:18:55.387 "get_zone_info": false, 00:18:55.387 "zone_management": false, 00:18:55.387 "zone_append": false, 00:18:55.387 "compare": false, 00:18:55.387 "compare_and_write": false, 00:18:55.387 "abort": true, 00:18:55.387 "seek_hole": false, 00:18:55.387 "seek_data": false, 00:18:55.387 "copy": true, 00:18:55.387 "nvme_iov_md": false 00:18:55.387 }, 00:18:55.387 "memory_domains": [ 00:18:55.387 { 00:18:55.387 "dma_device_id": "system", 00:18:55.387 "dma_device_type": 1 00:18:55.387 }, 00:18:55.387 { 00:18:55.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.387 "dma_device_type": 2 00:18:55.387 } 00:18:55.387 ], 00:18:55.387 "driver_specific": {} 00:18:55.387 }' 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.387 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.646 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.646 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.646 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.646 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.646 10:26:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:55.904 [2024-07-15 10:26:32.906922] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:55.904 [2024-07-15 10:26:32.906954] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:55.904 [2024-07-15 10:26:32.907007] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:55.904 [2024-07-15 10:26:32.907068] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:55.904 [2024-07-15 10:26:32.907081] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa70040 name Existed_Raid, state offline 00:18:55.904 10:26:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 535316 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 535316 ']' 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 535316 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 535316 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 535316' 00:18:55.904 killing process with pid 535316 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 535316 00:18:55.904 [2024-07-15 10:26:32.976181] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:55.904 10:26:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 535316 00:18:55.904 [2024-07-15 10:26:33.017770] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:56.166 10:26:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:56.166 00:18:56.166 real 0m32.471s 00:18:56.166 user 1m0.025s 00:18:56.166 sys 0m5.820s 00:18:56.166 10:26:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:56.166 10:26:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.166 ************************************ 00:18:56.166 END TEST raid_state_function_test 
00:18:56.166 ************************************ 00:18:56.166 10:26:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:56.166 10:26:33 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:56.166 10:26:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:56.166 10:26:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:56.166 10:26:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:56.166 ************************************ 00:18:56.166 START TEST raid_state_function_test_sb 00:18:56.166 ************************************ 00:18:56.166 10:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:56.166 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=540208 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 540208' 00:18:56.167 Process raid pid: 540208 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 540208 /var/tmp/spdk-raid.sock 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 540208 ']' 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:56.167 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:56.167 10:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:56.426 [2024-07-15 10:26:33.381794] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:18:56.426 [2024-07-15 10:26:33.381861] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:56.426 [2024-07-15 10:26:33.513144] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:56.426 [2024-07-15 10:26:33.618828] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:56.683 [2024-07-15 10:26:33.683409] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:56.683 [2024-07-15 10:26:33.683444] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.250 10:26:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:57.250 10:26:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:57.250 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:57.509 [2024-07-15 10:26:34.544097] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:57.509 [2024-07-15 10:26:34.544144] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:57.509 [2024-07-15 10:26:34.544155] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:57.509 [2024-07-15 10:26:34.544167] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:57.509 [2024-07-15 10:26:34.544176] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:57.509 [2024-07-15 10:26:34.544187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:18:57.509 [2024-07-15 10:26:34.544196] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:57.509 [2024-07-15 10:26:34.544207] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.509 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:57.768 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.768 "name": "Existed_Raid", 00:18:57.768 "uuid": "fcad5bf9-1496-4b85-b348-e5ffb485dc9a", 
00:18:57.768 "strip_size_kb": 64, 00:18:57.768 "state": "configuring", 00:18:57.768 "raid_level": "raid0", 00:18:57.768 "superblock": true, 00:18:57.768 "num_base_bdevs": 4, 00:18:57.768 "num_base_bdevs_discovered": 0, 00:18:57.768 "num_base_bdevs_operational": 4, 00:18:57.768 "base_bdevs_list": [ 00:18:57.768 { 00:18:57.768 "name": "BaseBdev1", 00:18:57.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.768 "is_configured": false, 00:18:57.768 "data_offset": 0, 00:18:57.768 "data_size": 0 00:18:57.768 }, 00:18:57.768 { 00:18:57.768 "name": "BaseBdev2", 00:18:57.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.768 "is_configured": false, 00:18:57.768 "data_offset": 0, 00:18:57.768 "data_size": 0 00:18:57.768 }, 00:18:57.768 { 00:18:57.768 "name": "BaseBdev3", 00:18:57.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.768 "is_configured": false, 00:18:57.768 "data_offset": 0, 00:18:57.768 "data_size": 0 00:18:57.768 }, 00:18:57.768 { 00:18:57.768 "name": "BaseBdev4", 00:18:57.768 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.768 "is_configured": false, 00:18:57.768 "data_offset": 0, 00:18:57.768 "data_size": 0 00:18:57.768 } 00:18:57.768 ] 00:18:57.768 }' 00:18:57.768 10:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.768 10:26:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:58.334 10:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:58.593 [2024-07-15 10:26:35.618789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:58.593 [2024-07-15 10:26:35.618819] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22abaa0 name Existed_Raid, state configuring 00:18:58.593 10:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:58.853 [2024-07-15 10:26:35.863464] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:58.853 [2024-07-15 10:26:35.863497] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:58.853 [2024-07-15 10:26:35.863507] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:58.853 [2024-07-15 10:26:35.863518] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:58.853 [2024-07-15 10:26:35.863527] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:58.853 [2024-07-15 10:26:35.863538] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:58.853 [2024-07-15 10:26:35.863546] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:58.853 [2024-07-15 10:26:35.863557] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:58.853 10:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:59.112 [2024-07-15 10:26:36.107231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:59.112 BaseBdev1 00:18:59.112 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:59.112 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:59.112 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:59.112 10:26:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:59.112 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:59.112 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:59.112 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:59.371 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:59.630 [ 00:18:59.630 { 00:18:59.630 "name": "BaseBdev1", 00:18:59.630 "aliases": [ 00:18:59.630 "8d254049-7280-4ed0-8699-3f585702abe7" 00:18:59.630 ], 00:18:59.630 "product_name": "Malloc disk", 00:18:59.630 "block_size": 512, 00:18:59.630 "num_blocks": 65536, 00:18:59.630 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:18:59.630 "assigned_rate_limits": { 00:18:59.630 "rw_ios_per_sec": 0, 00:18:59.630 "rw_mbytes_per_sec": 0, 00:18:59.630 "r_mbytes_per_sec": 0, 00:18:59.630 "w_mbytes_per_sec": 0 00:18:59.630 }, 00:18:59.630 "claimed": true, 00:18:59.630 "claim_type": "exclusive_write", 00:18:59.630 "zoned": false, 00:18:59.630 "supported_io_types": { 00:18:59.630 "read": true, 00:18:59.630 "write": true, 00:18:59.630 "unmap": true, 00:18:59.630 "flush": true, 00:18:59.630 "reset": true, 00:18:59.630 "nvme_admin": false, 00:18:59.630 "nvme_io": false, 00:18:59.630 "nvme_io_md": false, 00:18:59.630 "write_zeroes": true, 00:18:59.630 "zcopy": true, 00:18:59.630 "get_zone_info": false, 00:18:59.630 "zone_management": false, 00:18:59.630 "zone_append": false, 00:18:59.630 "compare": false, 00:18:59.630 "compare_and_write": false, 00:18:59.630 "abort": true, 00:18:59.630 "seek_hole": false, 00:18:59.630 "seek_data": false, 
00:18:59.630 "copy": true, 00:18:59.630 "nvme_iov_md": false 00:18:59.630 }, 00:18:59.630 "memory_domains": [ 00:18:59.630 { 00:18:59.630 "dma_device_id": "system", 00:18:59.630 "dma_device_type": 1 00:18:59.630 }, 00:18:59.630 { 00:18:59.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:59.630 "dma_device_type": 2 00:18:59.630 } 00:18:59.630 ], 00:18:59.630 "driver_specific": {} 00:18:59.630 } 00:18:59.630 ] 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.630 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.890 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.890 "name": "Existed_Raid", 00:18:59.890 "uuid": "97a1ff43-b8d8-496e-a71f-9841ae59df85", 00:18:59.890 "strip_size_kb": 64, 00:18:59.890 "state": "configuring", 00:18:59.890 "raid_level": "raid0", 00:18:59.890 "superblock": true, 00:18:59.890 "num_base_bdevs": 4, 00:18:59.890 "num_base_bdevs_discovered": 1, 00:18:59.890 "num_base_bdevs_operational": 4, 00:18:59.890 "base_bdevs_list": [ 00:18:59.890 { 00:18:59.890 "name": "BaseBdev1", 00:18:59.890 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:18:59.890 "is_configured": true, 00:18:59.890 "data_offset": 2048, 00:18:59.890 "data_size": 63488 00:18:59.890 }, 00:18:59.890 { 00:18:59.890 "name": "BaseBdev2", 00:18:59.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.890 "is_configured": false, 00:18:59.890 "data_offset": 0, 00:18:59.890 "data_size": 0 00:18:59.890 }, 00:18:59.890 { 00:18:59.890 "name": "BaseBdev3", 00:18:59.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.890 "is_configured": false, 00:18:59.890 "data_offset": 0, 00:18:59.890 "data_size": 0 00:18:59.890 }, 00:18:59.890 { 00:18:59.890 "name": "BaseBdev4", 00:18:59.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:59.890 "is_configured": false, 00:18:59.890 "data_offset": 0, 00:18:59.890 "data_size": 0 00:18:59.890 } 00:18:59.890 ] 00:18:59.890 }' 00:18:59.890 10:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.891 10:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:00.458 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:00.458 [2024-07-15 10:26:37.611212] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:19:00.458 [2024-07-15 10:26:37.611251] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22ab310 name Existed_Raid, state configuring 00:19:00.458 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:00.717 [2024-07-15 10:26:37.787731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:00.717 [2024-07-15 10:26:37.789165] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:00.717 [2024-07-15 10:26:37.789197] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:00.717 [2024-07-15 10:26:37.789207] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:00.717 [2024-07-15 10:26:37.789220] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:00.717 [2024-07-15 10:26:37.789229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:00.717 [2024-07-15 10:26:37.789240] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:00.717 10:26:37 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.717 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.718 10:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:00.977 10:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:00.977 "name": "Existed_Raid", 00:19:00.977 "uuid": "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b", 00:19:00.977 "strip_size_kb": 64, 00:19:00.977 "state": "configuring", 00:19:00.977 "raid_level": "raid0", 00:19:00.977 "superblock": true, 00:19:00.977 "num_base_bdevs": 4, 00:19:00.977 "num_base_bdevs_discovered": 1, 00:19:00.977 "num_base_bdevs_operational": 4, 00:19:00.977 "base_bdevs_list": [ 00:19:00.977 { 00:19:00.977 "name": "BaseBdev1", 00:19:00.977 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:19:00.977 "is_configured": true, 00:19:00.977 "data_offset": 2048, 00:19:00.977 "data_size": 63488 00:19:00.977 }, 00:19:00.977 { 00:19:00.977 "name": "BaseBdev2", 00:19:00.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.977 "is_configured": false, 
00:19:00.977 "data_offset": 0, 00:19:00.977 "data_size": 0 00:19:00.977 }, 00:19:00.977 { 00:19:00.977 "name": "BaseBdev3", 00:19:00.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.977 "is_configured": false, 00:19:00.977 "data_offset": 0, 00:19:00.977 "data_size": 0 00:19:00.977 }, 00:19:00.977 { 00:19:00.977 "name": "BaseBdev4", 00:19:00.977 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:00.977 "is_configured": false, 00:19:00.977 "data_offset": 0, 00:19:00.977 "data_size": 0 00:19:00.977 } 00:19:00.977 ] 00:19:00.977 }' 00:19:00.977 10:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:00.977 10:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:01.546 10:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:01.806 [2024-07-15 10:26:38.894020] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:01.806 BaseBdev2 00:19:01.806 10:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:01.806 10:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:01.806 10:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:01.806 10:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:01.806 10:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:01.806 10:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:01.806 10:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:19:02.065 10:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:02.324 [ 00:19:02.324 { 00:19:02.324 "name": "BaseBdev2", 00:19:02.324 "aliases": [ 00:19:02.324 "7c9b0de8-68da-4940-abe2-7a86a64f0bf1" 00:19:02.324 ], 00:19:02.324 "product_name": "Malloc disk", 00:19:02.324 "block_size": 512, 00:19:02.324 "num_blocks": 65536, 00:19:02.324 "uuid": "7c9b0de8-68da-4940-abe2-7a86a64f0bf1", 00:19:02.324 "assigned_rate_limits": { 00:19:02.324 "rw_ios_per_sec": 0, 00:19:02.324 "rw_mbytes_per_sec": 0, 00:19:02.324 "r_mbytes_per_sec": 0, 00:19:02.324 "w_mbytes_per_sec": 0 00:19:02.324 }, 00:19:02.324 "claimed": true, 00:19:02.324 "claim_type": "exclusive_write", 00:19:02.324 "zoned": false, 00:19:02.324 "supported_io_types": { 00:19:02.324 "read": true, 00:19:02.324 "write": true, 00:19:02.324 "unmap": true, 00:19:02.324 "flush": true, 00:19:02.324 "reset": true, 00:19:02.324 "nvme_admin": false, 00:19:02.324 "nvme_io": false, 00:19:02.324 "nvme_io_md": false, 00:19:02.324 "write_zeroes": true, 00:19:02.324 "zcopy": true, 00:19:02.324 "get_zone_info": false, 00:19:02.324 "zone_management": false, 00:19:02.324 "zone_append": false, 00:19:02.324 "compare": false, 00:19:02.324 "compare_and_write": false, 00:19:02.324 "abort": true, 00:19:02.324 "seek_hole": false, 00:19:02.324 "seek_data": false, 00:19:02.324 "copy": true, 00:19:02.324 "nvme_iov_md": false 00:19:02.324 }, 00:19:02.324 "memory_domains": [ 00:19:02.324 { 00:19:02.324 "dma_device_id": "system", 00:19:02.324 "dma_device_type": 1 00:19:02.324 }, 00:19:02.324 { 00:19:02.324 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.324 "dma_device_type": 2 00:19:02.324 } 00:19:02.324 ], 00:19:02.324 "driver_specific": {} 00:19:02.324 } 00:19:02.324 ] 00:19:02.324 10:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:19:02.324 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:02.324 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:02.324 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:02.324 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:02.324 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:02.324 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.325 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:02.584 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.584 "name": "Existed_Raid", 00:19:02.584 "uuid": "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b", 00:19:02.584 "strip_size_kb": 64, 
00:19:02.584 "state": "configuring", 00:19:02.584 "raid_level": "raid0", 00:19:02.584 "superblock": true, 00:19:02.584 "num_base_bdevs": 4, 00:19:02.584 "num_base_bdevs_discovered": 2, 00:19:02.584 "num_base_bdevs_operational": 4, 00:19:02.584 "base_bdevs_list": [ 00:19:02.584 { 00:19:02.584 "name": "BaseBdev1", 00:19:02.584 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:19:02.584 "is_configured": true, 00:19:02.584 "data_offset": 2048, 00:19:02.584 "data_size": 63488 00:19:02.584 }, 00:19:02.584 { 00:19:02.584 "name": "BaseBdev2", 00:19:02.584 "uuid": "7c9b0de8-68da-4940-abe2-7a86a64f0bf1", 00:19:02.584 "is_configured": true, 00:19:02.584 "data_offset": 2048, 00:19:02.584 "data_size": 63488 00:19:02.584 }, 00:19:02.584 { 00:19:02.584 "name": "BaseBdev3", 00:19:02.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.584 "is_configured": false, 00:19:02.584 "data_offset": 0, 00:19:02.584 "data_size": 0 00:19:02.584 }, 00:19:02.584 { 00:19:02.584 "name": "BaseBdev4", 00:19:02.584 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.584 "is_configured": false, 00:19:02.584 "data_offset": 0, 00:19:02.584 "data_size": 0 00:19:02.584 } 00:19:02.584 ] 00:19:02.584 }' 00:19:02.584 10:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.584 10:26:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:03.167 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:03.426 [2024-07-15 10:26:40.485887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:03.426 BaseBdev3 00:19:03.426 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:03.426 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:19:03.426 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:03.426 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:03.426 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:03.426 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:03.426 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:03.684 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:03.943 [ 00:19:03.943 { 00:19:03.943 "name": "BaseBdev3", 00:19:03.943 "aliases": [ 00:19:03.943 "d9b1374f-3927-46f8-a45d-5988492914c2" 00:19:03.943 ], 00:19:03.943 "product_name": "Malloc disk", 00:19:03.943 "block_size": 512, 00:19:03.943 "num_blocks": 65536, 00:19:03.943 "uuid": "d9b1374f-3927-46f8-a45d-5988492914c2", 00:19:03.943 "assigned_rate_limits": { 00:19:03.943 "rw_ios_per_sec": 0, 00:19:03.943 "rw_mbytes_per_sec": 0, 00:19:03.943 "r_mbytes_per_sec": 0, 00:19:03.944 "w_mbytes_per_sec": 0 00:19:03.944 }, 00:19:03.944 "claimed": true, 00:19:03.944 "claim_type": "exclusive_write", 00:19:03.944 "zoned": false, 00:19:03.944 "supported_io_types": { 00:19:03.944 "read": true, 00:19:03.944 "write": true, 00:19:03.944 "unmap": true, 00:19:03.944 "flush": true, 00:19:03.944 "reset": true, 00:19:03.944 "nvme_admin": false, 00:19:03.944 "nvme_io": false, 00:19:03.944 "nvme_io_md": false, 00:19:03.944 "write_zeroes": true, 00:19:03.944 "zcopy": true, 00:19:03.944 "get_zone_info": false, 00:19:03.944 "zone_management": false, 00:19:03.944 "zone_append": false, 00:19:03.944 
"compare": false, 00:19:03.944 "compare_and_write": false, 00:19:03.944 "abort": true, 00:19:03.944 "seek_hole": false, 00:19:03.944 "seek_data": false, 00:19:03.944 "copy": true, 00:19:03.944 "nvme_iov_md": false 00:19:03.944 }, 00:19:03.944 "memory_domains": [ 00:19:03.944 { 00:19:03.944 "dma_device_id": "system", 00:19:03.944 "dma_device_type": 1 00:19:03.944 }, 00:19:03.944 { 00:19:03.944 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.944 "dma_device_type": 2 00:19:03.944 } 00:19:03.944 ], 00:19:03.944 "driver_specific": {} 00:19:03.944 } 00:19:03.944 ] 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.944 10:26:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.944 10:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:04.203 10:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.203 "name": "Existed_Raid", 00:19:04.203 "uuid": "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b", 00:19:04.203 "strip_size_kb": 64, 00:19:04.203 "state": "configuring", 00:19:04.203 "raid_level": "raid0", 00:19:04.203 "superblock": true, 00:19:04.203 "num_base_bdevs": 4, 00:19:04.203 "num_base_bdevs_discovered": 3, 00:19:04.203 "num_base_bdevs_operational": 4, 00:19:04.203 "base_bdevs_list": [ 00:19:04.203 { 00:19:04.203 "name": "BaseBdev1", 00:19:04.203 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:19:04.203 "is_configured": true, 00:19:04.203 "data_offset": 2048, 00:19:04.203 "data_size": 63488 00:19:04.203 }, 00:19:04.203 { 00:19:04.203 "name": "BaseBdev2", 00:19:04.203 "uuid": "7c9b0de8-68da-4940-abe2-7a86a64f0bf1", 00:19:04.203 "is_configured": true, 00:19:04.203 "data_offset": 2048, 00:19:04.203 "data_size": 63488 00:19:04.203 }, 00:19:04.203 { 00:19:04.203 "name": "BaseBdev3", 00:19:04.203 "uuid": "d9b1374f-3927-46f8-a45d-5988492914c2", 00:19:04.203 "is_configured": true, 00:19:04.203 "data_offset": 2048, 00:19:04.203 "data_size": 63488 00:19:04.203 }, 00:19:04.203 { 00:19:04.203 "name": "BaseBdev4", 00:19:04.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:04.203 "is_configured": false, 00:19:04.203 "data_offset": 0, 00:19:04.203 "data_size": 0 00:19:04.203 } 00:19:04.203 ] 00:19:04.203 }' 00:19:04.203 10:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.203 10:26:41 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:04.772 10:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:05.032 [2024-07-15 10:26:42.049417] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:05.032 [2024-07-15 10:26:42.049589] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22ac350 00:19:05.032 [2024-07-15 10:26:42.049602] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:05.032 [2024-07-15 10:26:42.049774] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22ac020 00:19:05.032 [2024-07-15 10:26:42.049892] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22ac350 00:19:05.032 [2024-07-15 10:26:42.049902] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22ac350 00:19:05.032 [2024-07-15 10:26:42.050001] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:05.032 BaseBdev4 00:19:05.032 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:05.032 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:05.032 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:05.032 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:05.032 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:05.032 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:05.032 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:05.291 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:05.550 [ 00:19:05.550 { 00:19:05.550 "name": "BaseBdev4", 00:19:05.550 "aliases": [ 00:19:05.550 "7bfff2dd-a99b-4000-91de-b56cd44a8026" 00:19:05.550 ], 00:19:05.550 "product_name": "Malloc disk", 00:19:05.550 "block_size": 512, 00:19:05.550 "num_blocks": 65536, 00:19:05.550 "uuid": "7bfff2dd-a99b-4000-91de-b56cd44a8026", 00:19:05.550 "assigned_rate_limits": { 00:19:05.550 "rw_ios_per_sec": 0, 00:19:05.550 "rw_mbytes_per_sec": 0, 00:19:05.550 "r_mbytes_per_sec": 0, 00:19:05.550 "w_mbytes_per_sec": 0 00:19:05.550 }, 00:19:05.550 "claimed": true, 00:19:05.550 "claim_type": "exclusive_write", 00:19:05.550 "zoned": false, 00:19:05.550 "supported_io_types": { 00:19:05.550 "read": true, 00:19:05.550 "write": true, 00:19:05.550 "unmap": true, 00:19:05.550 "flush": true, 00:19:05.550 "reset": true, 00:19:05.550 "nvme_admin": false, 00:19:05.550 "nvme_io": false, 00:19:05.550 "nvme_io_md": false, 00:19:05.550 "write_zeroes": true, 00:19:05.550 "zcopy": true, 00:19:05.550 "get_zone_info": false, 00:19:05.550 "zone_management": false, 00:19:05.550 "zone_append": false, 00:19:05.550 "compare": false, 00:19:05.551 "compare_and_write": false, 00:19:05.551 "abort": true, 00:19:05.551 "seek_hole": false, 00:19:05.551 "seek_data": false, 00:19:05.551 "copy": true, 00:19:05.551 "nvme_iov_md": false 00:19:05.551 }, 00:19:05.551 "memory_domains": [ 00:19:05.551 { 00:19:05.551 "dma_device_id": "system", 00:19:05.551 "dma_device_type": 1 00:19:05.551 }, 00:19:05.551 { 00:19:05.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:05.551 "dma_device_type": 2 00:19:05.551 } 00:19:05.551 ], 00:19:05.551 "driver_specific": {} 00:19:05.551 } 00:19:05.551 ] 
00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.551 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:05.810 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.810 "name": "Existed_Raid", 00:19:05.810 
"uuid": "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b", 00:19:05.810 "strip_size_kb": 64, 00:19:05.810 "state": "online", 00:19:05.810 "raid_level": "raid0", 00:19:05.810 "superblock": true, 00:19:05.810 "num_base_bdevs": 4, 00:19:05.810 "num_base_bdevs_discovered": 4, 00:19:05.810 "num_base_bdevs_operational": 4, 00:19:05.810 "base_bdevs_list": [ 00:19:05.810 { 00:19:05.810 "name": "BaseBdev1", 00:19:05.810 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:19:05.810 "is_configured": true, 00:19:05.810 "data_offset": 2048, 00:19:05.810 "data_size": 63488 00:19:05.810 }, 00:19:05.810 { 00:19:05.810 "name": "BaseBdev2", 00:19:05.810 "uuid": "7c9b0de8-68da-4940-abe2-7a86a64f0bf1", 00:19:05.810 "is_configured": true, 00:19:05.810 "data_offset": 2048, 00:19:05.810 "data_size": 63488 00:19:05.810 }, 00:19:05.810 { 00:19:05.810 "name": "BaseBdev3", 00:19:05.810 "uuid": "d9b1374f-3927-46f8-a45d-5988492914c2", 00:19:05.810 "is_configured": true, 00:19:05.810 "data_offset": 2048, 00:19:05.810 "data_size": 63488 00:19:05.810 }, 00:19:05.810 { 00:19:05.810 "name": "BaseBdev4", 00:19:05.810 "uuid": "7bfff2dd-a99b-4000-91de-b56cd44a8026", 00:19:05.810 "is_configured": true, 00:19:05.810 "data_offset": 2048, 00:19:05.810 "data_size": 63488 00:19:05.810 } 00:19:05.810 ] 00:19:05.810 }' 00:19:05.810 10:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.810 10:26:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:06.376 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:06.376 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:06.376 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:06.376 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:06.376 10:26:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:06.376 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:06.376 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:06.376 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:06.635 [2024-07-15 10:26:43.621945] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:06.635 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:06.635 "name": "Existed_Raid", 00:19:06.635 "aliases": [ 00:19:06.635 "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b" 00:19:06.635 ], 00:19:06.635 "product_name": "Raid Volume", 00:19:06.635 "block_size": 512, 00:19:06.635 "num_blocks": 253952, 00:19:06.635 "uuid": "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b", 00:19:06.635 "assigned_rate_limits": { 00:19:06.635 "rw_ios_per_sec": 0, 00:19:06.635 "rw_mbytes_per_sec": 0, 00:19:06.635 "r_mbytes_per_sec": 0, 00:19:06.635 "w_mbytes_per_sec": 0 00:19:06.635 }, 00:19:06.635 "claimed": false, 00:19:06.635 "zoned": false, 00:19:06.635 "supported_io_types": { 00:19:06.635 "read": true, 00:19:06.635 "write": true, 00:19:06.635 "unmap": true, 00:19:06.635 "flush": true, 00:19:06.635 "reset": true, 00:19:06.635 "nvme_admin": false, 00:19:06.635 "nvme_io": false, 00:19:06.635 "nvme_io_md": false, 00:19:06.635 "write_zeroes": true, 00:19:06.635 "zcopy": false, 00:19:06.635 "get_zone_info": false, 00:19:06.635 "zone_management": false, 00:19:06.635 "zone_append": false, 00:19:06.635 "compare": false, 00:19:06.635 "compare_and_write": false, 00:19:06.635 "abort": false, 00:19:06.635 "seek_hole": false, 00:19:06.635 "seek_data": false, 00:19:06.635 "copy": false, 00:19:06.635 "nvme_iov_md": false 00:19:06.635 }, 00:19:06.635 
"memory_domains": [ 00:19:06.635 { 00:19:06.635 "dma_device_id": "system", 00:19:06.635 "dma_device_type": 1 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.635 "dma_device_type": 2 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "dma_device_id": "system", 00:19:06.635 "dma_device_type": 1 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.635 "dma_device_type": 2 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "dma_device_id": "system", 00:19:06.635 "dma_device_type": 1 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.635 "dma_device_type": 2 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "dma_device_id": "system", 00:19:06.635 "dma_device_type": 1 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.635 "dma_device_type": 2 00:19:06.635 } 00:19:06.635 ], 00:19:06.635 "driver_specific": { 00:19:06.635 "raid": { 00:19:06.635 "uuid": "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b", 00:19:06.635 "strip_size_kb": 64, 00:19:06.635 "state": "online", 00:19:06.635 "raid_level": "raid0", 00:19:06.635 "superblock": true, 00:19:06.635 "num_base_bdevs": 4, 00:19:06.635 "num_base_bdevs_discovered": 4, 00:19:06.635 "num_base_bdevs_operational": 4, 00:19:06.635 "base_bdevs_list": [ 00:19:06.635 { 00:19:06.635 "name": "BaseBdev1", 00:19:06.635 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:19:06.635 "is_configured": true, 00:19:06.635 "data_offset": 2048, 00:19:06.635 "data_size": 63488 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "name": "BaseBdev2", 00:19:06.635 "uuid": "7c9b0de8-68da-4940-abe2-7a86a64f0bf1", 00:19:06.635 "is_configured": true, 00:19:06.635 "data_offset": 2048, 00:19:06.635 "data_size": 63488 00:19:06.635 }, 00:19:06.635 { 00:19:06.635 "name": "BaseBdev3", 00:19:06.635 "uuid": "d9b1374f-3927-46f8-a45d-5988492914c2", 00:19:06.635 "is_configured": true, 00:19:06.635 "data_offset": 2048, 00:19:06.635 
"data_size": 63488 00:19:06.635 }, 00:19:06.636 { 00:19:06.636 "name": "BaseBdev4", 00:19:06.636 "uuid": "7bfff2dd-a99b-4000-91de-b56cd44a8026", 00:19:06.636 "is_configured": true, 00:19:06.636 "data_offset": 2048, 00:19:06.636 "data_size": 63488 00:19:06.636 } 00:19:06.636 ] 00:19:06.636 } 00:19:06.636 } 00:19:06.636 }' 00:19:06.636 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:06.636 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:06.636 BaseBdev2 00:19:06.636 BaseBdev3 00:19:06.636 BaseBdev4' 00:19:06.636 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:06.636 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:06.636 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:06.895 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:06.895 "name": "BaseBdev1", 00:19:06.895 "aliases": [ 00:19:06.895 "8d254049-7280-4ed0-8699-3f585702abe7" 00:19:06.895 ], 00:19:06.895 "product_name": "Malloc disk", 00:19:06.895 "block_size": 512, 00:19:06.895 "num_blocks": 65536, 00:19:06.895 "uuid": "8d254049-7280-4ed0-8699-3f585702abe7", 00:19:06.895 "assigned_rate_limits": { 00:19:06.895 "rw_ios_per_sec": 0, 00:19:06.895 "rw_mbytes_per_sec": 0, 00:19:06.895 "r_mbytes_per_sec": 0, 00:19:06.895 "w_mbytes_per_sec": 0 00:19:06.895 }, 00:19:06.895 "claimed": true, 00:19:06.895 "claim_type": "exclusive_write", 00:19:06.895 "zoned": false, 00:19:06.895 "supported_io_types": { 00:19:06.895 "read": true, 00:19:06.895 "write": true, 00:19:06.895 "unmap": true, 00:19:06.895 "flush": true, 00:19:06.895 "reset": true, 
00:19:06.895 "nvme_admin": false, 00:19:06.895 "nvme_io": false, 00:19:06.895 "nvme_io_md": false, 00:19:06.895 "write_zeroes": true, 00:19:06.895 "zcopy": true, 00:19:06.895 "get_zone_info": false, 00:19:06.895 "zone_management": false, 00:19:06.895 "zone_append": false, 00:19:06.895 "compare": false, 00:19:06.895 "compare_and_write": false, 00:19:06.895 "abort": true, 00:19:06.895 "seek_hole": false, 00:19:06.895 "seek_data": false, 00:19:06.895 "copy": true, 00:19:06.895 "nvme_iov_md": false 00:19:06.895 }, 00:19:06.895 "memory_domains": [ 00:19:06.895 { 00:19:06.895 "dma_device_id": "system", 00:19:06.895 "dma_device_type": 1 00:19:06.895 }, 00:19:06.895 { 00:19:06.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:06.895 "dma_device_type": 2 00:19:06.895 } 00:19:06.895 ], 00:19:06.895 "driver_specific": {} 00:19:06.895 }' 00:19:06.895 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.895 10:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:06.895 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:06.895 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:06.895 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:07.154 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:07.413 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:07.413 "name": "BaseBdev2", 00:19:07.413 "aliases": [ 00:19:07.413 "7c9b0de8-68da-4940-abe2-7a86a64f0bf1" 00:19:07.413 ], 00:19:07.413 "product_name": "Malloc disk", 00:19:07.413 "block_size": 512, 00:19:07.413 "num_blocks": 65536, 00:19:07.413 "uuid": "7c9b0de8-68da-4940-abe2-7a86a64f0bf1", 00:19:07.413 "assigned_rate_limits": { 00:19:07.413 "rw_ios_per_sec": 0, 00:19:07.413 "rw_mbytes_per_sec": 0, 00:19:07.413 "r_mbytes_per_sec": 0, 00:19:07.413 "w_mbytes_per_sec": 0 00:19:07.413 }, 00:19:07.413 "claimed": true, 00:19:07.413 "claim_type": "exclusive_write", 00:19:07.413 "zoned": false, 00:19:07.413 "supported_io_types": { 00:19:07.413 "read": true, 00:19:07.413 "write": true, 00:19:07.413 "unmap": true, 00:19:07.413 "flush": true, 00:19:07.413 "reset": true, 00:19:07.413 "nvme_admin": false, 00:19:07.413 "nvme_io": false, 00:19:07.413 "nvme_io_md": false, 00:19:07.413 "write_zeroes": true, 00:19:07.413 "zcopy": true, 00:19:07.413 "get_zone_info": false, 00:19:07.413 "zone_management": false, 00:19:07.413 "zone_append": false, 00:19:07.413 "compare": false, 00:19:07.413 "compare_and_write": false, 00:19:07.413 "abort": true, 00:19:07.413 "seek_hole": false, 00:19:07.414 "seek_data": false, 00:19:07.414 "copy": true, 00:19:07.414 "nvme_iov_md": false 00:19:07.414 }, 00:19:07.414 "memory_domains": [ 00:19:07.414 { 
00:19:07.414 "dma_device_id": "system", 00:19:07.414 "dma_device_type": 1 00:19:07.414 }, 00:19:07.414 { 00:19:07.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.414 "dma_device_type": 2 00:19:07.414 } 00:19:07.414 ], 00:19:07.414 "driver_specific": {} 00:19:07.414 }' 00:19:07.414 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:07.414 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.672 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:07.931 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:07.931 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:07.931 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:07.931 10:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:07.931 10:26:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:07.931 "name": "BaseBdev3", 00:19:07.931 "aliases": [ 00:19:07.931 "d9b1374f-3927-46f8-a45d-5988492914c2" 00:19:07.931 ], 00:19:07.931 "product_name": "Malloc disk", 00:19:07.931 "block_size": 512, 00:19:07.931 "num_blocks": 65536, 00:19:07.931 "uuid": "d9b1374f-3927-46f8-a45d-5988492914c2", 00:19:07.931 "assigned_rate_limits": { 00:19:07.931 "rw_ios_per_sec": 0, 00:19:07.931 "rw_mbytes_per_sec": 0, 00:19:07.931 "r_mbytes_per_sec": 0, 00:19:07.931 "w_mbytes_per_sec": 0 00:19:07.931 }, 00:19:07.931 "claimed": true, 00:19:07.931 "claim_type": "exclusive_write", 00:19:07.931 "zoned": false, 00:19:07.931 "supported_io_types": { 00:19:07.931 "read": true, 00:19:07.931 "write": true, 00:19:07.931 "unmap": true, 00:19:07.931 "flush": true, 00:19:07.931 "reset": true, 00:19:07.931 "nvme_admin": false, 00:19:07.931 "nvme_io": false, 00:19:07.931 "nvme_io_md": false, 00:19:07.931 "write_zeroes": true, 00:19:07.931 "zcopy": true, 00:19:07.931 "get_zone_info": false, 00:19:07.931 "zone_management": false, 00:19:07.931 "zone_append": false, 00:19:07.931 "compare": false, 00:19:07.931 "compare_and_write": false, 00:19:07.931 "abort": true, 00:19:07.931 "seek_hole": false, 00:19:07.931 "seek_data": false, 00:19:07.931 "copy": true, 00:19:07.931 "nvme_iov_md": false 00:19:07.931 }, 00:19:07.931 "memory_domains": [ 00:19:07.931 { 00:19:07.931 "dma_device_id": "system", 00:19:07.931 "dma_device_type": 1 00:19:07.931 }, 00:19:07.931 { 00:19:07.931 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.931 "dma_device_type": 2 00:19:07.931 } 00:19:07.931 ], 00:19:07.931 "driver_specific": {} 00:19:07.931 }' 00:19:07.931 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:07.931 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.189 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.447 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:08.448 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:08.448 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:08.448 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:08.707 "name": "BaseBdev4", 00:19:08.707 "aliases": [ 00:19:08.707 "7bfff2dd-a99b-4000-91de-b56cd44a8026" 00:19:08.707 ], 00:19:08.707 "product_name": "Malloc disk", 00:19:08.707 "block_size": 512, 00:19:08.707 "num_blocks": 65536, 00:19:08.707 "uuid": "7bfff2dd-a99b-4000-91de-b56cd44a8026", 00:19:08.707 "assigned_rate_limits": { 00:19:08.707 "rw_ios_per_sec": 0, 00:19:08.707 "rw_mbytes_per_sec": 0, 00:19:08.707 "r_mbytes_per_sec": 0, 00:19:08.707 "w_mbytes_per_sec": 0 
00:19:08.707 }, 00:19:08.707 "claimed": true, 00:19:08.707 "claim_type": "exclusive_write", 00:19:08.707 "zoned": false, 00:19:08.707 "supported_io_types": { 00:19:08.707 "read": true, 00:19:08.707 "write": true, 00:19:08.707 "unmap": true, 00:19:08.707 "flush": true, 00:19:08.707 "reset": true, 00:19:08.707 "nvme_admin": false, 00:19:08.707 "nvme_io": false, 00:19:08.707 "nvme_io_md": false, 00:19:08.707 "write_zeroes": true, 00:19:08.707 "zcopy": true, 00:19:08.707 "get_zone_info": false, 00:19:08.707 "zone_management": false, 00:19:08.707 "zone_append": false, 00:19:08.707 "compare": false, 00:19:08.707 "compare_and_write": false, 00:19:08.707 "abort": true, 00:19:08.707 "seek_hole": false, 00:19:08.707 "seek_data": false, 00:19:08.707 "copy": true, 00:19:08.707 "nvme_iov_md": false 00:19:08.707 }, 00:19:08.707 "memory_domains": [ 00:19:08.707 { 00:19:08.707 "dma_device_id": "system", 00:19:08.707 "dma_device_type": 1 00:19:08.707 }, 00:19:08.707 { 00:19:08.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:08.707 "dma_device_type": 2 00:19:08.707 } 00:19:08.707 ], 00:19:08.707 "driver_specific": {} 00:19:08.707 }' 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.707 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:08.966 
10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:08.966 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.966 10:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:08.966 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:08.966 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:09.224 [2024-07-15 10:26:46.244614] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:09.224 [2024-07-15 10:26:46.244641] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:09.224 [2024-07-15 10:26:46.244689] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.224 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:09.483 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.483 "name": "Existed_Raid", 00:19:09.483 "uuid": "fbaa73fe-dae9-4a4a-9f13-01c74af29e9b", 00:19:09.483 "strip_size_kb": 64, 00:19:09.483 "state": "offline", 00:19:09.483 "raid_level": "raid0", 00:19:09.483 "superblock": true, 00:19:09.483 "num_base_bdevs": 4, 00:19:09.483 "num_base_bdevs_discovered": 3, 00:19:09.483 "num_base_bdevs_operational": 3, 00:19:09.483 "base_bdevs_list": [ 00:19:09.483 { 00:19:09.483 "name": null, 00:19:09.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.483 "is_configured": false, 00:19:09.483 "data_offset": 2048, 00:19:09.483 "data_size": 63488 00:19:09.483 }, 00:19:09.483 { 00:19:09.483 "name": "BaseBdev2", 00:19:09.483 "uuid": "7c9b0de8-68da-4940-abe2-7a86a64f0bf1", 00:19:09.483 "is_configured": true, 00:19:09.483 "data_offset": 2048, 00:19:09.483 "data_size": 63488 00:19:09.483 }, 00:19:09.483 
{ 00:19:09.483 "name": "BaseBdev3", 00:19:09.483 "uuid": "d9b1374f-3927-46f8-a45d-5988492914c2", 00:19:09.483 "is_configured": true, 00:19:09.483 "data_offset": 2048, 00:19:09.483 "data_size": 63488 00:19:09.483 }, 00:19:09.483 { 00:19:09.483 "name": "BaseBdev4", 00:19:09.483 "uuid": "7bfff2dd-a99b-4000-91de-b56cd44a8026", 00:19:09.483 "is_configured": true, 00:19:09.483 "data_offset": 2048, 00:19:09.483 "data_size": 63488 00:19:09.483 } 00:19:09.483 ] 00:19:09.483 }' 00:19:09.483 10:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.483 10:26:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:10.050 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:10.050 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:10.050 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.050 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:10.308 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:10.308 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:10.308 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:10.588 [2024-07-15 10:26:47.570031] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:10.588 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:10.588 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:10.588 
10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.588 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:10.846 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:10.846 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:10.846 10:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:11.103 [2024-07-15 10:26:48.063330] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:11.103 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:11.103 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:11.103 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.103 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:11.361 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:11.361 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:11.361 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:11.361 [2024-07-15 10:26:48.500259] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:11.361 [2024-07-15 10:26:48.500300] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22ac350 name Existed_Raid, state offline 00:19:11.361 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:11.361 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:11.361 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.361 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:11.619 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:11.619 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:11.619 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:11.619 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:11.619 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:11.619 10:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:11.877 BaseBdev2 00:19:11.877 10:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:11.877 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:11.877 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:11.877 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:11.877 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
00:19:11.877 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:11.877 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:12.135 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:12.393 [ 00:19:12.393 { 00:19:12.393 "name": "BaseBdev2", 00:19:12.393 "aliases": [ 00:19:12.393 "10d5bcb9-b1c0-4697-9c85-51749eb70d35" 00:19:12.393 ], 00:19:12.393 "product_name": "Malloc disk", 00:19:12.393 "block_size": 512, 00:19:12.393 "num_blocks": 65536, 00:19:12.393 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:12.393 "assigned_rate_limits": { 00:19:12.393 "rw_ios_per_sec": 0, 00:19:12.393 "rw_mbytes_per_sec": 0, 00:19:12.393 "r_mbytes_per_sec": 0, 00:19:12.393 "w_mbytes_per_sec": 0 00:19:12.393 }, 00:19:12.393 "claimed": false, 00:19:12.393 "zoned": false, 00:19:12.393 "supported_io_types": { 00:19:12.393 "read": true, 00:19:12.393 "write": true, 00:19:12.393 "unmap": true, 00:19:12.393 "flush": true, 00:19:12.393 "reset": true, 00:19:12.393 "nvme_admin": false, 00:19:12.393 "nvme_io": false, 00:19:12.393 "nvme_io_md": false, 00:19:12.393 "write_zeroes": true, 00:19:12.393 "zcopy": true, 00:19:12.393 "get_zone_info": false, 00:19:12.393 "zone_management": false, 00:19:12.393 "zone_append": false, 00:19:12.393 "compare": false, 00:19:12.393 "compare_and_write": false, 00:19:12.393 "abort": true, 00:19:12.393 "seek_hole": false, 00:19:12.393 "seek_data": false, 00:19:12.393 "copy": true, 00:19:12.393 "nvme_iov_md": false 00:19:12.393 }, 00:19:12.393 "memory_domains": [ 00:19:12.393 { 00:19:12.393 "dma_device_id": "system", 00:19:12.393 "dma_device_type": 1 00:19:12.393 }, 00:19:12.393 { 00:19:12.393 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.393 "dma_device_type": 2 00:19:12.393 } 00:19:12.393 ], 00:19:12.393 "driver_specific": {} 00:19:12.393 } 00:19:12.393 ] 00:19:12.393 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:12.393 10:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:12.393 10:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:12.393 10:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:12.652 BaseBdev3 00:19:12.652 10:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:12.652 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:12.652 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:12.652 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:12.652 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:12.652 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:12.652 10:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:12.910 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:13.169 [ 00:19:13.169 { 00:19:13.169 "name": "BaseBdev3", 00:19:13.169 "aliases": [ 00:19:13.169 "adbfd94c-7b89-4ff3-97a5-817c0184253d" 
00:19:13.169 ], 00:19:13.169 "product_name": "Malloc disk", 00:19:13.169 "block_size": 512, 00:19:13.169 "num_blocks": 65536, 00:19:13.169 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:13.169 "assigned_rate_limits": { 00:19:13.169 "rw_ios_per_sec": 0, 00:19:13.169 "rw_mbytes_per_sec": 0, 00:19:13.169 "r_mbytes_per_sec": 0, 00:19:13.169 "w_mbytes_per_sec": 0 00:19:13.169 }, 00:19:13.169 "claimed": false, 00:19:13.169 "zoned": false, 00:19:13.169 "supported_io_types": { 00:19:13.169 "read": true, 00:19:13.169 "write": true, 00:19:13.169 "unmap": true, 00:19:13.169 "flush": true, 00:19:13.169 "reset": true, 00:19:13.169 "nvme_admin": false, 00:19:13.169 "nvme_io": false, 00:19:13.169 "nvme_io_md": false, 00:19:13.169 "write_zeroes": true, 00:19:13.169 "zcopy": true, 00:19:13.169 "get_zone_info": false, 00:19:13.169 "zone_management": false, 00:19:13.169 "zone_append": false, 00:19:13.169 "compare": false, 00:19:13.169 "compare_and_write": false, 00:19:13.169 "abort": true, 00:19:13.169 "seek_hole": false, 00:19:13.169 "seek_data": false, 00:19:13.169 "copy": true, 00:19:13.169 "nvme_iov_md": false 00:19:13.169 }, 00:19:13.169 "memory_domains": [ 00:19:13.169 { 00:19:13.169 "dma_device_id": "system", 00:19:13.169 "dma_device_type": 1 00:19:13.169 }, 00:19:13.169 { 00:19:13.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.169 "dma_device_type": 2 00:19:13.169 } 00:19:13.169 ], 00:19:13.169 "driver_specific": {} 00:19:13.169 } 00:19:13.169 ] 00:19:13.169 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:13.169 10:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:13.169 10:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:13.169 10:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:19:13.428 BaseBdev4 00:19:13.428 10:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:13.428 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:13.428 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:13.428 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:13.428 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:13.428 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:13.428 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:13.687 10:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:13.945 [ 00:19:13.945 { 00:19:13.945 "name": "BaseBdev4", 00:19:13.945 "aliases": [ 00:19:13.945 "26127136-ff77-4179-b54f-a28712145bd1" 00:19:13.945 ], 00:19:13.945 "product_name": "Malloc disk", 00:19:13.945 "block_size": 512, 00:19:13.945 "num_blocks": 65536, 00:19:13.945 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:13.945 "assigned_rate_limits": { 00:19:13.945 "rw_ios_per_sec": 0, 00:19:13.945 "rw_mbytes_per_sec": 0, 00:19:13.945 "r_mbytes_per_sec": 0, 00:19:13.945 "w_mbytes_per_sec": 0 00:19:13.945 }, 00:19:13.945 "claimed": false, 00:19:13.945 "zoned": false, 00:19:13.945 "supported_io_types": { 00:19:13.945 "read": true, 00:19:13.945 "write": true, 00:19:13.945 "unmap": true, 00:19:13.945 "flush": true, 00:19:13.945 "reset": true, 00:19:13.945 "nvme_admin": false, 00:19:13.945 "nvme_io": false, 00:19:13.945 
"nvme_io_md": false, 00:19:13.945 "write_zeroes": true, 00:19:13.945 "zcopy": true, 00:19:13.945 "get_zone_info": false, 00:19:13.945 "zone_management": false, 00:19:13.945 "zone_append": false, 00:19:13.945 "compare": false, 00:19:13.945 "compare_and_write": false, 00:19:13.945 "abort": true, 00:19:13.945 "seek_hole": false, 00:19:13.945 "seek_data": false, 00:19:13.945 "copy": true, 00:19:13.945 "nvme_iov_md": false 00:19:13.945 }, 00:19:13.945 "memory_domains": [ 00:19:13.945 { 00:19:13.945 "dma_device_id": "system", 00:19:13.945 "dma_device_type": 1 00:19:13.945 }, 00:19:13.945 { 00:19:13.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.945 "dma_device_type": 2 00:19:13.945 } 00:19:13.945 ], 00:19:13.945 "driver_specific": {} 00:19:13.945 } 00:19:13.945 ] 00:19:13.945 10:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:13.945 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:13.945 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:13.945 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:14.205 [2024-07-15 10:26:51.228677] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:14.205 [2024-07-15 10:26:51.228719] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:14.205 [2024-07-15 10:26:51.228738] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:14.205 [2024-07-15 10:26:51.230103] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:14.205 [2024-07-15 10:26:51.230142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.205 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:14.463 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.463 "name": "Existed_Raid", 00:19:14.463 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:14.463 "strip_size_kb": 64, 00:19:14.463 "state": "configuring", 00:19:14.463 "raid_level": "raid0", 00:19:14.463 "superblock": true, 00:19:14.463 "num_base_bdevs": 4, 00:19:14.463 "num_base_bdevs_discovered": 3, 00:19:14.463 
"num_base_bdevs_operational": 4, 00:19:14.463 "base_bdevs_list": [ 00:19:14.463 { 00:19:14.463 "name": "BaseBdev1", 00:19:14.463 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.463 "is_configured": false, 00:19:14.463 "data_offset": 0, 00:19:14.463 "data_size": 0 00:19:14.463 }, 00:19:14.463 { 00:19:14.463 "name": "BaseBdev2", 00:19:14.463 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:14.463 "is_configured": true, 00:19:14.463 "data_offset": 2048, 00:19:14.463 "data_size": 63488 00:19:14.463 }, 00:19:14.463 { 00:19:14.463 "name": "BaseBdev3", 00:19:14.463 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:14.463 "is_configured": true, 00:19:14.463 "data_offset": 2048, 00:19:14.463 "data_size": 63488 00:19:14.463 }, 00:19:14.463 { 00:19:14.463 "name": "BaseBdev4", 00:19:14.463 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:14.463 "is_configured": true, 00:19:14.463 "data_offset": 2048, 00:19:14.463 "data_size": 63488 00:19:14.463 } 00:19:14.463 ] 00:19:14.463 }' 00:19:14.463 10:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.463 10:26:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:15.049 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:15.049 [2024-07-15 10:26:52.243320] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:15.308 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.566 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.566 "name": "Existed_Raid", 00:19:15.566 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:15.566 "strip_size_kb": 64, 00:19:15.566 "state": "configuring", 00:19:15.566 "raid_level": "raid0", 00:19:15.566 "superblock": true, 00:19:15.566 "num_base_bdevs": 4, 00:19:15.566 "num_base_bdevs_discovered": 2, 00:19:15.566 "num_base_bdevs_operational": 4, 00:19:15.566 "base_bdevs_list": [ 00:19:15.566 { 00:19:15.566 "name": "BaseBdev1", 00:19:15.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:15.566 "is_configured": false, 00:19:15.566 "data_offset": 0, 00:19:15.566 "data_size": 0 00:19:15.567 }, 00:19:15.567 { 00:19:15.567 "name": null, 00:19:15.567 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:15.567 "is_configured": false, 00:19:15.567 "data_offset": 2048, 00:19:15.567 "data_size": 
63488 00:19:15.567 }, 00:19:15.567 { 00:19:15.567 "name": "BaseBdev3", 00:19:15.567 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:15.567 "is_configured": true, 00:19:15.567 "data_offset": 2048, 00:19:15.567 "data_size": 63488 00:19:15.567 }, 00:19:15.567 { 00:19:15.567 "name": "BaseBdev4", 00:19:15.567 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:15.567 "is_configured": true, 00:19:15.567 "data_offset": 2048, 00:19:15.567 "data_size": 63488 00:19:15.567 } 00:19:15.567 ] 00:19:15.567 }' 00:19:15.567 10:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.567 10:26:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:16.134 10:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.134 10:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:16.391 10:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:16.391 10:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:16.650 [2024-07-15 10:26:53.591452] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:16.650 BaseBdev1 00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 
00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:16.650 10:26:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:16.908 [ 00:19:16.908 { 00:19:16.908 "name": "BaseBdev1", 00:19:16.908 "aliases": [ 00:19:16.908 "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1" 00:19:16.908 ], 00:19:16.908 "product_name": "Malloc disk", 00:19:16.908 "block_size": 512, 00:19:16.908 "num_blocks": 65536, 00:19:16.908 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:16.908 "assigned_rate_limits": { 00:19:16.908 "rw_ios_per_sec": 0, 00:19:16.908 "rw_mbytes_per_sec": 0, 00:19:16.908 "r_mbytes_per_sec": 0, 00:19:16.908 "w_mbytes_per_sec": 0 00:19:16.908 }, 00:19:16.908 "claimed": true, 00:19:16.908 "claim_type": "exclusive_write", 00:19:16.908 "zoned": false, 00:19:16.908 "supported_io_types": { 00:19:16.908 "read": true, 00:19:16.908 "write": true, 00:19:16.908 "unmap": true, 00:19:16.908 "flush": true, 00:19:16.908 "reset": true, 00:19:16.908 "nvme_admin": false, 00:19:16.908 "nvme_io": false, 00:19:16.908 "nvme_io_md": false, 00:19:16.908 "write_zeroes": true, 00:19:16.908 "zcopy": true, 00:19:16.908 "get_zone_info": false, 00:19:16.908 "zone_management": false, 00:19:16.908 "zone_append": false, 00:19:16.908 "compare": false, 00:19:16.908 "compare_and_write": false, 00:19:16.908 "abort": true, 00:19:16.908 "seek_hole": false, 00:19:16.908 "seek_data": false, 00:19:16.908 "copy": true, 00:19:16.908 "nvme_iov_md": false 00:19:16.908 }, 00:19:16.908 
"memory_domains": [ 00:19:16.908 { 00:19:16.908 "dma_device_id": "system", 00:19:16.908 "dma_device_type": 1 00:19:16.908 }, 00:19:16.908 { 00:19:16.908 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:16.908 "dma_device_type": 2 00:19:16.908 } 00:19:16.908 ], 00:19:16.908 "driver_specific": {} 00:19:16.908 } 00:19:16.908 ] 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.908 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.167 10:26:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.167 "name": "Existed_Raid", 00:19:17.167 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:17.167 "strip_size_kb": 64, 00:19:17.167 "state": "configuring", 00:19:17.167 "raid_level": "raid0", 00:19:17.167 "superblock": true, 00:19:17.167 "num_base_bdevs": 4, 00:19:17.167 "num_base_bdevs_discovered": 3, 00:19:17.167 "num_base_bdevs_operational": 4, 00:19:17.167 "base_bdevs_list": [ 00:19:17.167 { 00:19:17.167 "name": "BaseBdev1", 00:19:17.167 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:17.167 "is_configured": true, 00:19:17.167 "data_offset": 2048, 00:19:17.167 "data_size": 63488 00:19:17.167 }, 00:19:17.167 { 00:19:17.167 "name": null, 00:19:17.167 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:17.167 "is_configured": false, 00:19:17.167 "data_offset": 2048, 00:19:17.167 "data_size": 63488 00:19:17.167 }, 00:19:17.167 { 00:19:17.167 "name": "BaseBdev3", 00:19:17.167 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:17.167 "is_configured": true, 00:19:17.167 "data_offset": 2048, 00:19:17.167 "data_size": 63488 00:19:17.167 }, 00:19:17.167 { 00:19:17.167 "name": "BaseBdev4", 00:19:17.167 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:17.167 "is_configured": true, 00:19:17.167 "data_offset": 2048, 00:19:17.167 "data_size": 63488 00:19:17.167 } 00:19:17.167 ] 00:19:17.167 }' 00:19:17.167 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.167 10:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:18.104 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.104 10:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:18.104 10:26:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:18.104 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:18.363 [2024-07-15 10:26:55.352118] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.363 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:18.623 10:26:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.623 "name": "Existed_Raid", 00:19:18.623 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:18.623 "strip_size_kb": 64, 00:19:18.623 "state": "configuring", 00:19:18.623 "raid_level": "raid0", 00:19:18.623 "superblock": true, 00:19:18.623 "num_base_bdevs": 4, 00:19:18.623 "num_base_bdevs_discovered": 2, 00:19:18.623 "num_base_bdevs_operational": 4, 00:19:18.623 "base_bdevs_list": [ 00:19:18.623 { 00:19:18.623 "name": "BaseBdev1", 00:19:18.623 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:18.623 "is_configured": true, 00:19:18.623 "data_offset": 2048, 00:19:18.623 "data_size": 63488 00:19:18.623 }, 00:19:18.623 { 00:19:18.623 "name": null, 00:19:18.623 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:18.623 "is_configured": false, 00:19:18.623 "data_offset": 2048, 00:19:18.623 "data_size": 63488 00:19:18.623 }, 00:19:18.623 { 00:19:18.623 "name": null, 00:19:18.623 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:18.623 "is_configured": false, 00:19:18.623 "data_offset": 2048, 00:19:18.623 "data_size": 63488 00:19:18.623 }, 00:19:18.623 { 00:19:18.623 "name": "BaseBdev4", 00:19:18.623 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:18.623 "is_configured": true, 00:19:18.623 "data_offset": 2048, 00:19:18.623 "data_size": 63488 00:19:18.623 } 00:19:18.623 ] 00:19:18.623 }' 00:19:18.623 10:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.623 10:26:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:19.190 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.190 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:19.449 10:26:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:19.449 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:19.709 [2024-07-15 10:26:56.659604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:19.709 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.968 10:26:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.968 "name": "Existed_Raid", 00:19:19.968 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:19.968 "strip_size_kb": 64, 00:19:19.968 "state": "configuring", 00:19:19.968 "raid_level": "raid0", 00:19:19.968 "superblock": true, 00:19:19.968 "num_base_bdevs": 4, 00:19:19.968 "num_base_bdevs_discovered": 3, 00:19:19.968 "num_base_bdevs_operational": 4, 00:19:19.968 "base_bdevs_list": [ 00:19:19.968 { 00:19:19.968 "name": "BaseBdev1", 00:19:19.968 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:19.968 "is_configured": true, 00:19:19.968 "data_offset": 2048, 00:19:19.968 "data_size": 63488 00:19:19.968 }, 00:19:19.968 { 00:19:19.968 "name": null, 00:19:19.968 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:19.968 "is_configured": false, 00:19:19.968 "data_offset": 2048, 00:19:19.968 "data_size": 63488 00:19:19.968 }, 00:19:19.968 { 00:19:19.968 "name": "BaseBdev3", 00:19:19.968 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:19.968 "is_configured": true, 00:19:19.968 "data_offset": 2048, 00:19:19.968 "data_size": 63488 00:19:19.968 }, 00:19:19.968 { 00:19:19.968 "name": "BaseBdev4", 00:19:19.968 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:19.968 "is_configured": true, 00:19:19.968 "data_offset": 2048, 00:19:19.968 "data_size": 63488 00:19:19.968 } 00:19:19.968 ] 00:19:19.968 }' 00:19:19.968 10:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.968 10:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:20.536 10:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:20.536 10:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.795 10:26:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:20.795 10:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:20.795 [2024-07-15 10:26:57.983132] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:21.067 "name": "Existed_Raid", 00:19:21.067 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:21.067 "strip_size_kb": 64, 00:19:21.067 "state": "configuring", 00:19:21.067 "raid_level": "raid0", 00:19:21.067 "superblock": true, 00:19:21.067 "num_base_bdevs": 4, 00:19:21.067 "num_base_bdevs_discovered": 2, 00:19:21.067 "num_base_bdevs_operational": 4, 00:19:21.067 "base_bdevs_list": [ 00:19:21.067 { 00:19:21.067 "name": null, 00:19:21.067 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:21.067 "is_configured": false, 00:19:21.067 "data_offset": 2048, 00:19:21.067 "data_size": 63488 00:19:21.067 }, 00:19:21.067 { 00:19:21.067 "name": null, 00:19:21.067 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:21.067 "is_configured": false, 00:19:21.067 "data_offset": 2048, 00:19:21.067 "data_size": 63488 00:19:21.067 }, 00:19:21.067 { 00:19:21.067 "name": "BaseBdev3", 00:19:21.067 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:21.067 "is_configured": true, 00:19:21.067 "data_offset": 2048, 00:19:21.067 "data_size": 63488 00:19:21.067 }, 00:19:21.067 { 00:19:21.067 "name": "BaseBdev4", 00:19:21.067 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:21.067 "is_configured": true, 00:19:21.067 "data_offset": 2048, 00:19:21.067 "data_size": 63488 00:19:21.067 } 00:19:21.067 ] 00:19:21.067 }' 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:21.067 10:26:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:22.000 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.000 10:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:22.000 10:26:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:22.000 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:22.258 [2024-07-15 10:26:59.337062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.258 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.516 10:26:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.516 "name": "Existed_Raid", 00:19:22.516 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:22.516 "strip_size_kb": 64, 00:19:22.516 "state": "configuring", 00:19:22.516 "raid_level": "raid0", 00:19:22.516 "superblock": true, 00:19:22.516 "num_base_bdevs": 4, 00:19:22.516 "num_base_bdevs_discovered": 3, 00:19:22.516 "num_base_bdevs_operational": 4, 00:19:22.516 "base_bdevs_list": [ 00:19:22.516 { 00:19:22.516 "name": null, 00:19:22.516 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:22.516 "is_configured": false, 00:19:22.516 "data_offset": 2048, 00:19:22.516 "data_size": 63488 00:19:22.516 }, 00:19:22.516 { 00:19:22.516 "name": "BaseBdev2", 00:19:22.516 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:22.516 "is_configured": true, 00:19:22.516 "data_offset": 2048, 00:19:22.516 "data_size": 63488 00:19:22.516 }, 00:19:22.516 { 00:19:22.516 "name": "BaseBdev3", 00:19:22.516 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:22.516 "is_configured": true, 00:19:22.516 "data_offset": 2048, 00:19:22.516 "data_size": 63488 00:19:22.516 }, 00:19:22.516 { 00:19:22.516 "name": "BaseBdev4", 00:19:22.516 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:22.516 "is_configured": true, 00:19:22.516 "data_offset": 2048, 00:19:22.516 "data_size": 63488 00:19:22.516 } 00:19:22.516 ] 00:19:22.516 }' 00:19:22.516 10:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.516 10:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:23.081 10:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.081 10:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:23.339 10:27:00 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:23.339 10:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.339 10:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:23.597 10:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 994a08ba-8cf4-43ab-ada2-9d23d1fa76c1 00:19:23.856 [2024-07-15 10:27:00.944784] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:23.856 [2024-07-15 10:27:00.944956] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22b2470 00:19:23.856 [2024-07-15 10:27:00.944970] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:23.856 [2024-07-15 10:27:00.945144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22a2c40 00:19:23.856 [2024-07-15 10:27:00.945257] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22b2470 00:19:23.856 [2024-07-15 10:27:00.945267] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22b2470 00:19:23.856 [2024-07-15 10:27:00.945360] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:23.856 NewBaseBdev 00:19:23.856 10:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:23.856 10:27:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:23.856 10:27:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:23.856 10:27:00 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:19:23.856 10:27:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:23.856 10:27:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:23.856 10:27:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:24.114 10:27:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:24.372 [ 00:19:24.372 { 00:19:24.372 "name": "NewBaseBdev", 00:19:24.372 "aliases": [ 00:19:24.372 "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1" 00:19:24.372 ], 00:19:24.372 "product_name": "Malloc disk", 00:19:24.372 "block_size": 512, 00:19:24.372 "num_blocks": 65536, 00:19:24.372 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:24.372 "assigned_rate_limits": { 00:19:24.372 "rw_ios_per_sec": 0, 00:19:24.372 "rw_mbytes_per_sec": 0, 00:19:24.372 "r_mbytes_per_sec": 0, 00:19:24.372 "w_mbytes_per_sec": 0 00:19:24.372 }, 00:19:24.372 "claimed": true, 00:19:24.372 "claim_type": "exclusive_write", 00:19:24.372 "zoned": false, 00:19:24.372 "supported_io_types": { 00:19:24.372 "read": true, 00:19:24.372 "write": true, 00:19:24.372 "unmap": true, 00:19:24.372 "flush": true, 00:19:24.372 "reset": true, 00:19:24.372 "nvme_admin": false, 00:19:24.372 "nvme_io": false, 00:19:24.372 "nvme_io_md": false, 00:19:24.372 "write_zeroes": true, 00:19:24.372 "zcopy": true, 00:19:24.372 "get_zone_info": false, 00:19:24.372 "zone_management": false, 00:19:24.372 "zone_append": false, 00:19:24.372 "compare": false, 00:19:24.372 "compare_and_write": false, 00:19:24.372 "abort": true, 00:19:24.372 "seek_hole": false, 00:19:24.372 "seek_data": false, 00:19:24.372 "copy": true, 00:19:24.372 
"nvme_iov_md": false 00:19:24.372 }, 00:19:24.372 "memory_domains": [ 00:19:24.372 { 00:19:24.372 "dma_device_id": "system", 00:19:24.372 "dma_device_type": 1 00:19:24.372 }, 00:19:24.372 { 00:19:24.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.372 "dma_device_type": 2 00:19:24.372 } 00:19:24.372 ], 00:19:24.372 "driver_specific": {} 00:19:24.372 } 00:19:24.372 ] 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.372 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:19:24.630 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.630 "name": "Existed_Raid", 00:19:24.630 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:24.630 "strip_size_kb": 64, 00:19:24.630 "state": "online", 00:19:24.630 "raid_level": "raid0", 00:19:24.630 "superblock": true, 00:19:24.630 "num_base_bdevs": 4, 00:19:24.630 "num_base_bdevs_discovered": 4, 00:19:24.630 "num_base_bdevs_operational": 4, 00:19:24.630 "base_bdevs_list": [ 00:19:24.630 { 00:19:24.630 "name": "NewBaseBdev", 00:19:24.630 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:24.630 "is_configured": true, 00:19:24.630 "data_offset": 2048, 00:19:24.630 "data_size": 63488 00:19:24.630 }, 00:19:24.630 { 00:19:24.630 "name": "BaseBdev2", 00:19:24.630 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:24.630 "is_configured": true, 00:19:24.630 "data_offset": 2048, 00:19:24.630 "data_size": 63488 00:19:24.630 }, 00:19:24.630 { 00:19:24.630 "name": "BaseBdev3", 00:19:24.630 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:24.630 "is_configured": true, 00:19:24.630 "data_offset": 2048, 00:19:24.630 "data_size": 63488 00:19:24.630 }, 00:19:24.630 { 00:19:24.630 "name": "BaseBdev4", 00:19:24.630 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:24.630 "is_configured": true, 00:19:24.630 "data_offset": 2048, 00:19:24.630 "data_size": 63488 00:19:24.630 } 00:19:24.630 ] 00:19:24.630 }' 00:19:24.630 10:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.630 10:27:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_info 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:25.196 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:25.454 [2024-07-15 10:27:02.453122] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:25.454 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:25.454 "name": "Existed_Raid", 00:19:25.454 "aliases": [ 00:19:25.454 "77988c13-6e80-4889-8058-355eb367e173" 00:19:25.454 ], 00:19:25.454 "product_name": "Raid Volume", 00:19:25.454 "block_size": 512, 00:19:25.454 "num_blocks": 253952, 00:19:25.454 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:25.454 "assigned_rate_limits": { 00:19:25.454 "rw_ios_per_sec": 0, 00:19:25.454 "rw_mbytes_per_sec": 0, 00:19:25.454 "r_mbytes_per_sec": 0, 00:19:25.454 "w_mbytes_per_sec": 0 00:19:25.454 }, 00:19:25.454 "claimed": false, 00:19:25.454 "zoned": false, 00:19:25.454 "supported_io_types": { 00:19:25.454 "read": true, 00:19:25.454 "write": true, 00:19:25.454 "unmap": true, 00:19:25.454 "flush": true, 00:19:25.454 "reset": true, 00:19:25.454 "nvme_admin": false, 00:19:25.454 "nvme_io": false, 00:19:25.454 "nvme_io_md": false, 00:19:25.454 "write_zeroes": true, 00:19:25.455 "zcopy": false, 00:19:25.455 "get_zone_info": false, 00:19:25.455 "zone_management": false, 00:19:25.455 "zone_append": false, 00:19:25.455 "compare": false, 00:19:25.455 "compare_and_write": false, 00:19:25.455 "abort": false, 
00:19:25.455 "seek_hole": false, 00:19:25.455 "seek_data": false, 00:19:25.455 "copy": false, 00:19:25.455 "nvme_iov_md": false 00:19:25.455 }, 00:19:25.455 "memory_domains": [ 00:19:25.455 { 00:19:25.455 "dma_device_id": "system", 00:19:25.455 "dma_device_type": 1 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.455 "dma_device_type": 2 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "dma_device_id": "system", 00:19:25.455 "dma_device_type": 1 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.455 "dma_device_type": 2 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "dma_device_id": "system", 00:19:25.455 "dma_device_type": 1 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.455 "dma_device_type": 2 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "dma_device_id": "system", 00:19:25.455 "dma_device_type": 1 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.455 "dma_device_type": 2 00:19:25.455 } 00:19:25.455 ], 00:19:25.455 "driver_specific": { 00:19:25.455 "raid": { 00:19:25.455 "uuid": "77988c13-6e80-4889-8058-355eb367e173", 00:19:25.455 "strip_size_kb": 64, 00:19:25.455 "state": "online", 00:19:25.455 "raid_level": "raid0", 00:19:25.455 "superblock": true, 00:19:25.455 "num_base_bdevs": 4, 00:19:25.455 "num_base_bdevs_discovered": 4, 00:19:25.455 "num_base_bdevs_operational": 4, 00:19:25.455 "base_bdevs_list": [ 00:19:25.455 { 00:19:25.455 "name": "NewBaseBdev", 00:19:25.455 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:25.455 "is_configured": true, 00:19:25.455 "data_offset": 2048, 00:19:25.455 "data_size": 63488 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "name": "BaseBdev2", 00:19:25.455 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:25.455 "is_configured": true, 00:19:25.455 "data_offset": 2048, 00:19:25.455 "data_size": 63488 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "name": 
"BaseBdev3", 00:19:25.455 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:25.455 "is_configured": true, 00:19:25.455 "data_offset": 2048, 00:19:25.455 "data_size": 63488 00:19:25.455 }, 00:19:25.455 { 00:19:25.455 "name": "BaseBdev4", 00:19:25.455 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:25.455 "is_configured": true, 00:19:25.455 "data_offset": 2048, 00:19:25.455 "data_size": 63488 00:19:25.455 } 00:19:25.455 ] 00:19:25.455 } 00:19:25.455 } 00:19:25.455 }' 00:19:25.455 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:25.455 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:25.455 BaseBdev2 00:19:25.455 BaseBdev3 00:19:25.455 BaseBdev4' 00:19:25.455 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:25.455 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:25.455 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:25.715 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:25.715 "name": "NewBaseBdev", 00:19:25.715 "aliases": [ 00:19:25.715 "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1" 00:19:25.715 ], 00:19:25.715 "product_name": "Malloc disk", 00:19:25.715 "block_size": 512, 00:19:25.715 "num_blocks": 65536, 00:19:25.715 "uuid": "994a08ba-8cf4-43ab-ada2-9d23d1fa76c1", 00:19:25.715 "assigned_rate_limits": { 00:19:25.715 "rw_ios_per_sec": 0, 00:19:25.715 "rw_mbytes_per_sec": 0, 00:19:25.715 "r_mbytes_per_sec": 0, 00:19:25.715 "w_mbytes_per_sec": 0 00:19:25.715 }, 00:19:25.715 "claimed": true, 00:19:25.715 "claim_type": "exclusive_write", 00:19:25.715 "zoned": false, 00:19:25.715 
"supported_io_types": { 00:19:25.715 "read": true, 00:19:25.715 "write": true, 00:19:25.715 "unmap": true, 00:19:25.715 "flush": true, 00:19:25.715 "reset": true, 00:19:25.715 "nvme_admin": false, 00:19:25.715 "nvme_io": false, 00:19:25.715 "nvme_io_md": false, 00:19:25.715 "write_zeroes": true, 00:19:25.715 "zcopy": true, 00:19:25.715 "get_zone_info": false, 00:19:25.715 "zone_management": false, 00:19:25.715 "zone_append": false, 00:19:25.715 "compare": false, 00:19:25.715 "compare_and_write": false, 00:19:25.715 "abort": true, 00:19:25.715 "seek_hole": false, 00:19:25.715 "seek_data": false, 00:19:25.715 "copy": true, 00:19:25.715 "nvme_iov_md": false 00:19:25.715 }, 00:19:25.715 "memory_domains": [ 00:19:25.715 { 00:19:25.715 "dma_device_id": "system", 00:19:25.715 "dma_device_type": 1 00:19:25.715 }, 00:19:25.715 { 00:19:25.715 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.715 "dma_device_type": 2 00:19:25.715 } 00:19:25.715 ], 00:19:25.715 "driver_specific": {} 00:19:25.715 }' 00:19:25.715 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.715 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.715 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.715 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.973 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.973 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:25.973 10:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.973 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.973 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.973 10:27:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.973 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.973 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:25.973 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:25.973 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:25.973 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:26.234 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:26.234 "name": "BaseBdev2", 00:19:26.234 "aliases": [ 00:19:26.234 "10d5bcb9-b1c0-4697-9c85-51749eb70d35" 00:19:26.234 ], 00:19:26.234 "product_name": "Malloc disk", 00:19:26.234 "block_size": 512, 00:19:26.234 "num_blocks": 65536, 00:19:26.234 "uuid": "10d5bcb9-b1c0-4697-9c85-51749eb70d35", 00:19:26.234 "assigned_rate_limits": { 00:19:26.234 "rw_ios_per_sec": 0, 00:19:26.234 "rw_mbytes_per_sec": 0, 00:19:26.234 "r_mbytes_per_sec": 0, 00:19:26.234 "w_mbytes_per_sec": 0 00:19:26.234 }, 00:19:26.234 "claimed": true, 00:19:26.234 "claim_type": "exclusive_write", 00:19:26.234 "zoned": false, 00:19:26.234 "supported_io_types": { 00:19:26.234 "read": true, 00:19:26.234 "write": true, 00:19:26.234 "unmap": true, 00:19:26.234 "flush": true, 00:19:26.234 "reset": true, 00:19:26.234 "nvme_admin": false, 00:19:26.234 "nvme_io": false, 00:19:26.234 "nvme_io_md": false, 00:19:26.234 "write_zeroes": true, 00:19:26.234 "zcopy": true, 00:19:26.234 "get_zone_info": false, 00:19:26.234 "zone_management": false, 00:19:26.234 "zone_append": false, 00:19:26.234 "compare": false, 00:19:26.234 "compare_and_write": false, 00:19:26.234 "abort": true, 00:19:26.234 
"seek_hole": false, 00:19:26.234 "seek_data": false, 00:19:26.234 "copy": true, 00:19:26.234 "nvme_iov_md": false 00:19:26.234 }, 00:19:26.234 "memory_domains": [ 00:19:26.234 { 00:19:26.234 "dma_device_id": "system", 00:19:26.234 "dma_device_type": 1 00:19:26.234 }, 00:19:26.234 { 00:19:26.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:26.234 "dma_device_type": 2 00:19:26.234 } 00:19:26.234 ], 00:19:26.234 "driver_specific": {} 00:19:26.234 }' 00:19:26.234 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:26.494 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:26.752 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:26.752 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:26.752 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:26.752 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:26.752 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:27.010 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:27.010 "name": "BaseBdev3", 00:19:27.010 "aliases": [ 00:19:27.010 "adbfd94c-7b89-4ff3-97a5-817c0184253d" 00:19:27.010 ], 00:19:27.010 "product_name": "Malloc disk", 00:19:27.010 "block_size": 512, 00:19:27.010 "num_blocks": 65536, 00:19:27.010 "uuid": "adbfd94c-7b89-4ff3-97a5-817c0184253d", 00:19:27.010 "assigned_rate_limits": { 00:19:27.010 "rw_ios_per_sec": 0, 00:19:27.010 "rw_mbytes_per_sec": 0, 00:19:27.010 "r_mbytes_per_sec": 0, 00:19:27.010 "w_mbytes_per_sec": 0 00:19:27.010 }, 00:19:27.010 "claimed": true, 00:19:27.010 "claim_type": "exclusive_write", 00:19:27.010 "zoned": false, 00:19:27.010 "supported_io_types": { 00:19:27.010 "read": true, 00:19:27.010 "write": true, 00:19:27.010 "unmap": true, 00:19:27.010 "flush": true, 00:19:27.010 "reset": true, 00:19:27.010 "nvme_admin": false, 00:19:27.010 "nvme_io": false, 00:19:27.010 "nvme_io_md": false, 00:19:27.010 "write_zeroes": true, 00:19:27.010 "zcopy": true, 00:19:27.010 "get_zone_info": false, 00:19:27.010 "zone_management": false, 00:19:27.010 "zone_append": false, 00:19:27.010 "compare": false, 00:19:27.010 "compare_and_write": false, 00:19:27.010 "abort": true, 00:19:27.010 "seek_hole": false, 00:19:27.010 "seek_data": false, 00:19:27.010 "copy": true, 00:19:27.010 "nvme_iov_md": false 00:19:27.010 }, 00:19:27.010 "memory_domains": [ 00:19:27.010 { 00:19:27.010 "dma_device_id": "system", 00:19:27.010 "dma_device_type": 1 00:19:27.010 }, 00:19:27.010 { 00:19:27.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.010 "dma_device_type": 2 00:19:27.010 } 00:19:27.010 ], 00:19:27.010 "driver_specific": {} 00:19:27.010 }' 00:19:27.010 10:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.010 
10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.010 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:27.010 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:27.010 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:27.010 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:27.010 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:27.308 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:27.622 "name": "BaseBdev4", 00:19:27.622 "aliases": [ 00:19:27.622 "26127136-ff77-4179-b54f-a28712145bd1" 00:19:27.622 ], 00:19:27.622 "product_name": "Malloc disk", 00:19:27.622 "block_size": 512, 00:19:27.622 "num_blocks": 65536, 00:19:27.622 "uuid": "26127136-ff77-4179-b54f-a28712145bd1", 00:19:27.622 
"assigned_rate_limits": { 00:19:27.622 "rw_ios_per_sec": 0, 00:19:27.622 "rw_mbytes_per_sec": 0, 00:19:27.622 "r_mbytes_per_sec": 0, 00:19:27.622 "w_mbytes_per_sec": 0 00:19:27.622 }, 00:19:27.622 "claimed": true, 00:19:27.622 "claim_type": "exclusive_write", 00:19:27.622 "zoned": false, 00:19:27.622 "supported_io_types": { 00:19:27.622 "read": true, 00:19:27.622 "write": true, 00:19:27.622 "unmap": true, 00:19:27.622 "flush": true, 00:19:27.622 "reset": true, 00:19:27.622 "nvme_admin": false, 00:19:27.622 "nvme_io": false, 00:19:27.622 "nvme_io_md": false, 00:19:27.622 "write_zeroes": true, 00:19:27.622 "zcopy": true, 00:19:27.622 "get_zone_info": false, 00:19:27.622 "zone_management": false, 00:19:27.622 "zone_append": false, 00:19:27.622 "compare": false, 00:19:27.622 "compare_and_write": false, 00:19:27.622 "abort": true, 00:19:27.622 "seek_hole": false, 00:19:27.622 "seek_data": false, 00:19:27.622 "copy": true, 00:19:27.622 "nvme_iov_md": false 00:19:27.622 }, 00:19:27.622 "memory_domains": [ 00:19:27.622 { 00:19:27.622 "dma_device_id": "system", 00:19:27.622 "dma_device_type": 1 00:19:27.622 }, 00:19:27.622 { 00:19:27.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:27.622 "dma_device_type": 2 00:19:27.622 } 00:19:27.622 ], 00:19:27.622 "driver_specific": {} 00:19:27.622 }' 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:27.622 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.881 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:27.881 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:27.881 10:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:28.141 [2024-07-15 10:27:05.115877] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:28.141 [2024-07-15 10:27:05.115903] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:28.141 [2024-07-15 10:27:05.115963] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:28.141 [2024-07-15 10:27:05.116030] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:28.141 [2024-07-15 10:27:05.116043] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22b2470 name Existed_Raid, state offline 00:19:28.141 10:27:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 540208 00:19:28.141 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 540208 ']' 00:19:28.141 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 540208 00:19:28.141 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:19:28.141 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # 
'[' Linux = Linux ']' 00:19:28.141 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 540208 00:19:28.141 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:28.142 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:28.142 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 540208' 00:19:28.142 killing process with pid 540208 00:19:28.142 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 540208 00:19:28.142 [2024-07-15 10:27:05.187204] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:28.142 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 540208 00:19:28.142 [2024-07-15 10:27:05.224485] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:28.401 10:27:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:28.401 00:19:28.401 real 0m32.126s 00:19:28.401 user 0m59.036s 00:19:28.401 sys 0m5.709s 00:19:28.401 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:28.401 10:27:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:28.401 ************************************ 00:19:28.401 END TEST raid_state_function_test_sb 00:19:28.401 ************************************ 00:19:28.401 10:27:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:28.402 10:27:05 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:19:28.402 10:27:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:19:28.402 10:27:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:28.402 10:27:05 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:19:28.402 ************************************ 00:19:28.402 START TEST raid_superblock_test 00:19:28.402 ************************************ 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 
00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=545414 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 545414 /var/tmp/spdk-raid.sock 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 545414 ']' 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:28.402 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.402 10:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:28.402 [2024-07-15 10:27:05.581289] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:19:28.402 [2024-07-15 10:27:05.581358] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid545414 ] 00:19:28.661 [2024-07-15 10:27:05.708204] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:28.661 [2024-07-15 10:27:05.815417] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:28.920 [2024-07-15 10:27:05.879581] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:28.920 [2024-07-15 10:27:05.879615] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:29.856 10:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:19:30.116 malloc1 00:19:30.116 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:30.375 [2024-07-15 10:27:07.455228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:30.375 [2024-07-15 10:27:07.455272] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:30.375 [2024-07-15 10:27:07.455293] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb8570 00:19:30.375 [2024-07-15 10:27:07.455306] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:30.375 [2024-07-15 10:27:07.456857] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:30.375 [2024-07-15 10:27:07.456887] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:30.375 pt1 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:30.375 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:30.375 10:27:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:30.632 malloc2 00:19:30.632 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:30.889 [2024-07-15 10:27:07.954573] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:30.889 [2024-07-15 10:27:07.954621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:30.889 [2024-07-15 10:27:07.954640] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb9970 00:19:30.889 [2024-07-15 10:27:07.954653] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:30.889 [2024-07-15 10:27:07.956315] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:30.889 [2024-07-15 10:27:07.956345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:30.889 pt2 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:30.889 10:27:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:30.889 10:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:31.146 malloc3 00:19:31.146 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:31.403 [2024-07-15 10:27:08.448510] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:31.403 [2024-07-15 10:27:08.448557] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.403 [2024-07-15 10:27:08.448576] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd50340 00:19:31.403 [2024-07-15 10:27:08.448590] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.403 [2024-07-15 10:27:08.450142] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.403 [2024-07-15 10:27:08.450171] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:31.403 pt3 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:31.403 
10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:31.403 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:31.660 malloc4 00:19:31.660 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:31.918 [2024-07-15 10:27:08.946477] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:31.918 [2024-07-15 10:27:08.946527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.918 [2024-07-15 10:27:08.946549] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd52c60 00:19:31.918 [2024-07-15 10:27:08.946562] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.918 [2024-07-15 10:27:08.948142] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.918 [2024-07-15 10:27:08.948171] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:31.918 pt4 00:19:31.918 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:31.918 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:31.918 10:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:32.176 [2024-07-15 10:27:09.187149] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:19:32.176 [2024-07-15 10:27:09.188516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:32.176 [2024-07-15 10:27:09.188573] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:32.176 [2024-07-15 10:27:09.188623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:32.176 [2024-07-15 10:27:09.188795] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbb0530 00:19:32.176 [2024-07-15 10:27:09.188806] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:32.176 [2024-07-15 10:27:09.189029] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbae770 00:19:32.176 [2024-07-15 10:27:09.189181] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbb0530 00:19:32.176 [2024-07-15 10:27:09.189191] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbb0530 00:19:32.176 [2024-07-15 10:27:09.189295] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.176 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:32.177 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.177 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.177 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.177 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:32.435 10:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.435 "name": "raid_bdev1", 00:19:32.435 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:32.435 "strip_size_kb": 64, 00:19:32.435 "state": "online", 00:19:32.435 "raid_level": "raid0", 00:19:32.435 "superblock": true, 00:19:32.435 "num_base_bdevs": 4, 00:19:32.435 "num_base_bdevs_discovered": 4, 00:19:32.435 "num_base_bdevs_operational": 4, 00:19:32.435 "base_bdevs_list": [ 00:19:32.435 { 00:19:32.435 "name": "pt1", 00:19:32.435 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:32.435 "is_configured": true, 00:19:32.435 "data_offset": 2048, 00:19:32.435 "data_size": 63488 00:19:32.435 }, 00:19:32.435 { 00:19:32.435 "name": "pt2", 00:19:32.435 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:32.435 "is_configured": true, 00:19:32.435 "data_offset": 2048, 00:19:32.435 "data_size": 63488 00:19:32.435 }, 00:19:32.435 { 00:19:32.435 "name": "pt3", 00:19:32.435 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:32.435 "is_configured": true, 00:19:32.435 "data_offset": 2048, 00:19:32.435 "data_size": 63488 00:19:32.435 }, 00:19:32.435 { 00:19:32.435 "name": "pt4", 00:19:32.435 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:32.435 "is_configured": true, 00:19:32.435 "data_offset": 2048, 00:19:32.435 "data_size": 63488 00:19:32.435 } 00:19:32.435 ] 00:19:32.435 }' 00:19:32.435 10:27:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.435 10:27:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:33.000 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:33.258 [2024-07-15 10:27:10.214146] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:33.258 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:33.258 "name": "raid_bdev1", 00:19:33.258 "aliases": [ 00:19:33.258 "0287b0ec-f93d-44ed-b7cb-2eabea922bb0" 00:19:33.258 ], 00:19:33.258 "product_name": "Raid Volume", 00:19:33.258 "block_size": 512, 00:19:33.258 "num_blocks": 253952, 00:19:33.258 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:33.258 "assigned_rate_limits": { 00:19:33.258 "rw_ios_per_sec": 0, 00:19:33.258 "rw_mbytes_per_sec": 0, 00:19:33.258 "r_mbytes_per_sec": 0, 00:19:33.258 "w_mbytes_per_sec": 0 00:19:33.258 }, 00:19:33.258 "claimed": false, 00:19:33.258 "zoned": false, 00:19:33.258 "supported_io_types": { 00:19:33.258 "read": true, 00:19:33.258 "write": true, 00:19:33.258 
"unmap": true, 00:19:33.258 "flush": true, 00:19:33.258 "reset": true, 00:19:33.258 "nvme_admin": false, 00:19:33.258 "nvme_io": false, 00:19:33.258 "nvme_io_md": false, 00:19:33.258 "write_zeroes": true, 00:19:33.258 "zcopy": false, 00:19:33.258 "get_zone_info": false, 00:19:33.258 "zone_management": false, 00:19:33.258 "zone_append": false, 00:19:33.258 "compare": false, 00:19:33.258 "compare_and_write": false, 00:19:33.258 "abort": false, 00:19:33.258 "seek_hole": false, 00:19:33.258 "seek_data": false, 00:19:33.258 "copy": false, 00:19:33.258 "nvme_iov_md": false 00:19:33.258 }, 00:19:33.258 "memory_domains": [ 00:19:33.258 { 00:19:33.258 "dma_device_id": "system", 00:19:33.258 "dma_device_type": 1 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.258 "dma_device_type": 2 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "dma_device_id": "system", 00:19:33.258 "dma_device_type": 1 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.258 "dma_device_type": 2 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "dma_device_id": "system", 00:19:33.258 "dma_device_type": 1 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.258 "dma_device_type": 2 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "dma_device_id": "system", 00:19:33.258 "dma_device_type": 1 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.258 "dma_device_type": 2 00:19:33.258 } 00:19:33.258 ], 00:19:33.258 "driver_specific": { 00:19:33.258 "raid": { 00:19:33.258 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:33.258 "strip_size_kb": 64, 00:19:33.258 "state": "online", 00:19:33.258 "raid_level": "raid0", 00:19:33.258 "superblock": true, 00:19:33.258 "num_base_bdevs": 4, 00:19:33.258 "num_base_bdevs_discovered": 4, 00:19:33.258 "num_base_bdevs_operational": 4, 00:19:33.258 "base_bdevs_list": [ 00:19:33.258 { 00:19:33.258 "name": "pt1", 
00:19:33.258 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:33.258 "is_configured": true, 00:19:33.258 "data_offset": 2048, 00:19:33.258 "data_size": 63488 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "name": "pt2", 00:19:33.258 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:33.258 "is_configured": true, 00:19:33.258 "data_offset": 2048, 00:19:33.258 "data_size": 63488 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "name": "pt3", 00:19:33.258 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:33.258 "is_configured": true, 00:19:33.258 "data_offset": 2048, 00:19:33.258 "data_size": 63488 00:19:33.258 }, 00:19:33.258 { 00:19:33.258 "name": "pt4", 00:19:33.258 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:33.258 "is_configured": true, 00:19:33.258 "data_offset": 2048, 00:19:33.258 "data_size": 63488 00:19:33.258 } 00:19:33.258 ] 00:19:33.258 } 00:19:33.258 } 00:19:33.258 }' 00:19:33.258 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:33.258 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:33.258 pt2 00:19:33.258 pt3 00:19:33.258 pt4' 00:19:33.258 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:33.258 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:33.258 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:33.517 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:33.517 "name": "pt1", 00:19:33.517 "aliases": [ 00:19:33.517 "00000000-0000-0000-0000-000000000001" 00:19:33.517 ], 00:19:33.517 "product_name": "passthru", 00:19:33.517 "block_size": 512, 00:19:33.517 "num_blocks": 65536, 00:19:33.517 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:19:33.517 "assigned_rate_limits": { 00:19:33.517 "rw_ios_per_sec": 0, 00:19:33.517 "rw_mbytes_per_sec": 0, 00:19:33.517 "r_mbytes_per_sec": 0, 00:19:33.517 "w_mbytes_per_sec": 0 00:19:33.517 }, 00:19:33.517 "claimed": true, 00:19:33.517 "claim_type": "exclusive_write", 00:19:33.517 "zoned": false, 00:19:33.517 "supported_io_types": { 00:19:33.517 "read": true, 00:19:33.517 "write": true, 00:19:33.517 "unmap": true, 00:19:33.517 "flush": true, 00:19:33.517 "reset": true, 00:19:33.517 "nvme_admin": false, 00:19:33.517 "nvme_io": false, 00:19:33.517 "nvme_io_md": false, 00:19:33.517 "write_zeroes": true, 00:19:33.517 "zcopy": true, 00:19:33.517 "get_zone_info": false, 00:19:33.517 "zone_management": false, 00:19:33.517 "zone_append": false, 00:19:33.517 "compare": false, 00:19:33.517 "compare_and_write": false, 00:19:33.517 "abort": true, 00:19:33.517 "seek_hole": false, 00:19:33.517 "seek_data": false, 00:19:33.517 "copy": true, 00:19:33.517 "nvme_iov_md": false 00:19:33.517 }, 00:19:33.517 "memory_domains": [ 00:19:33.517 { 00:19:33.517 "dma_device_id": "system", 00:19:33.517 "dma_device_type": 1 00:19:33.517 }, 00:19:33.517 { 00:19:33.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.517 "dma_device_type": 2 00:19:33.517 } 00:19:33.517 ], 00:19:33.517 "driver_specific": { 00:19:33.517 "passthru": { 00:19:33.517 "name": "pt1", 00:19:33.517 "base_bdev_name": "malloc1" 00:19:33.517 } 00:19:33.517 } 00:19:33.517 }' 00:19:33.517 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:33.517 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:33.517 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:33.517 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:33.517 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:33.517 10:27:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:33.517 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:33.774 10:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:34.032 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:34.032 "name": "pt2", 00:19:34.032 "aliases": [ 00:19:34.032 "00000000-0000-0000-0000-000000000002" 00:19:34.032 ], 00:19:34.032 "product_name": "passthru", 00:19:34.032 "block_size": 512, 00:19:34.032 "num_blocks": 65536, 00:19:34.032 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:34.032 "assigned_rate_limits": { 00:19:34.032 "rw_ios_per_sec": 0, 00:19:34.032 "rw_mbytes_per_sec": 0, 00:19:34.032 "r_mbytes_per_sec": 0, 00:19:34.032 "w_mbytes_per_sec": 0 00:19:34.032 }, 00:19:34.032 "claimed": true, 00:19:34.032 "claim_type": "exclusive_write", 00:19:34.032 "zoned": false, 00:19:34.032 "supported_io_types": { 00:19:34.032 "read": true, 00:19:34.032 "write": true, 00:19:34.032 "unmap": true, 00:19:34.032 "flush": true, 00:19:34.032 "reset": true, 00:19:34.032 "nvme_admin": false, 00:19:34.032 
"nvme_io": false, 00:19:34.032 "nvme_io_md": false, 00:19:34.032 "write_zeroes": true, 00:19:34.032 "zcopy": true, 00:19:34.032 "get_zone_info": false, 00:19:34.032 "zone_management": false, 00:19:34.032 "zone_append": false, 00:19:34.032 "compare": false, 00:19:34.032 "compare_and_write": false, 00:19:34.032 "abort": true, 00:19:34.032 "seek_hole": false, 00:19:34.032 "seek_data": false, 00:19:34.032 "copy": true, 00:19:34.032 "nvme_iov_md": false 00:19:34.032 }, 00:19:34.032 "memory_domains": [ 00:19:34.032 { 00:19:34.032 "dma_device_id": "system", 00:19:34.032 "dma_device_type": 1 00:19:34.032 }, 00:19:34.032 { 00:19:34.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.032 "dma_device_type": 2 00:19:34.032 } 00:19:34.032 ], 00:19:34.032 "driver_specific": { 00:19:34.032 "passthru": { 00:19:34.032 "name": "pt2", 00:19:34.032 "base_bdev_name": "malloc2" 00:19:34.032 } 00:19:34.032 } 00:19:34.032 }' 00:19:34.032 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.032 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.032 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:34.032 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:34.289 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:34.289 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:34.289 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:34.289 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:34.289 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:34.289 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:34.289 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:34.546 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:34.546 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:34.546 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:34.546 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:34.546 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:34.546 "name": "pt3", 00:19:34.546 "aliases": [ 00:19:34.546 "00000000-0000-0000-0000-000000000003" 00:19:34.546 ], 00:19:34.546 "product_name": "passthru", 00:19:34.546 "block_size": 512, 00:19:34.546 "num_blocks": 65536, 00:19:34.546 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:34.546 "assigned_rate_limits": { 00:19:34.546 "rw_ios_per_sec": 0, 00:19:34.546 "rw_mbytes_per_sec": 0, 00:19:34.546 "r_mbytes_per_sec": 0, 00:19:34.546 "w_mbytes_per_sec": 0 00:19:34.546 }, 00:19:34.546 "claimed": true, 00:19:34.546 "claim_type": "exclusive_write", 00:19:34.546 "zoned": false, 00:19:34.546 "supported_io_types": { 00:19:34.546 "read": true, 00:19:34.546 "write": true, 00:19:34.546 "unmap": true, 00:19:34.546 "flush": true, 00:19:34.546 "reset": true, 00:19:34.546 "nvme_admin": false, 00:19:34.546 "nvme_io": false, 00:19:34.546 "nvme_io_md": false, 00:19:34.546 "write_zeroes": true, 00:19:34.546 "zcopy": true, 00:19:34.546 "get_zone_info": false, 00:19:34.546 "zone_management": false, 00:19:34.546 "zone_append": false, 00:19:34.547 "compare": false, 00:19:34.547 "compare_and_write": false, 00:19:34.547 "abort": true, 00:19:34.547 "seek_hole": false, 00:19:34.547 "seek_data": false, 00:19:34.547 "copy": true, 00:19:34.547 "nvme_iov_md": false 00:19:34.547 }, 00:19:34.547 "memory_domains": [ 00:19:34.547 { 00:19:34.547 "dma_device_id": "system", 00:19:34.547 
"dma_device_type": 1 00:19:34.547 }, 00:19:34.547 { 00:19:34.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.547 "dma_device_type": 2 00:19:34.547 } 00:19:34.547 ], 00:19:34.547 "driver_specific": { 00:19:34.547 "passthru": { 00:19:34.547 "name": "pt3", 00:19:34.547 "base_bdev_name": "malloc3" 00:19:34.547 } 00:19:34.547 } 00:19:34.547 }' 00:19:34.547 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.804 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:34.804 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:34.804 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:34.804 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:34.804 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:34.804 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:34.804 10:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.061 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:35.061 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.061 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.061 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:35.061 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:35.061 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:35.061 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:35.318 10:27:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:35.318 "name": "pt4", 00:19:35.318 "aliases": [ 00:19:35.318 "00000000-0000-0000-0000-000000000004" 00:19:35.318 ], 00:19:35.318 "product_name": "passthru", 00:19:35.318 "block_size": 512, 00:19:35.318 "num_blocks": 65536, 00:19:35.318 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:35.318 "assigned_rate_limits": { 00:19:35.318 "rw_ios_per_sec": 0, 00:19:35.318 "rw_mbytes_per_sec": 0, 00:19:35.318 "r_mbytes_per_sec": 0, 00:19:35.318 "w_mbytes_per_sec": 0 00:19:35.318 }, 00:19:35.318 "claimed": true, 00:19:35.318 "claim_type": "exclusive_write", 00:19:35.318 "zoned": false, 00:19:35.318 "supported_io_types": { 00:19:35.318 "read": true, 00:19:35.318 "write": true, 00:19:35.318 "unmap": true, 00:19:35.318 "flush": true, 00:19:35.318 "reset": true, 00:19:35.318 "nvme_admin": false, 00:19:35.318 "nvme_io": false, 00:19:35.318 "nvme_io_md": false, 00:19:35.318 "write_zeroes": true, 00:19:35.318 "zcopy": true, 00:19:35.318 "get_zone_info": false, 00:19:35.318 "zone_management": false, 00:19:35.318 "zone_append": false, 00:19:35.318 "compare": false, 00:19:35.318 "compare_and_write": false, 00:19:35.318 "abort": true, 00:19:35.318 "seek_hole": false, 00:19:35.318 "seek_data": false, 00:19:35.318 "copy": true, 00:19:35.318 "nvme_iov_md": false 00:19:35.318 }, 00:19:35.318 "memory_domains": [ 00:19:35.319 { 00:19:35.319 "dma_device_id": "system", 00:19:35.319 "dma_device_type": 1 00:19:35.319 }, 00:19:35.319 { 00:19:35.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.319 "dma_device_type": 2 00:19:35.319 } 00:19:35.319 ], 00:19:35.319 "driver_specific": { 00:19:35.319 "passthru": { 00:19:35.319 "name": "pt4", 00:19:35.319 "base_bdev_name": "malloc4" 00:19:35.319 } 00:19:35.319 } 00:19:35.319 }' 00:19:35.319 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.319 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:35.319 10:27:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:35.319 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.319 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:35.319 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:35.319 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.576 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:35.576 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:35.576 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.576 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:35.576 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:35.576 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:35.576 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:35.834 [2024-07-15 10:27:12.905283] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:35.834 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=0287b0ec-f93d-44ed-b7cb-2eabea922bb0 00:19:35.834 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 0287b0ec-f93d-44ed-b7cb-2eabea922bb0 ']' 00:19:35.834 10:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:36.092 [2024-07-15 10:27:13.149605] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:36.092 
[2024-07-15 10:27:13.149628] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:36.092 [2024-07-15 10:27:13.149679] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:36.092 [2024-07-15 10:27:13.149745] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:36.092 [2024-07-15 10:27:13.149757] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb0530 name raid_bdev1, state offline 00:19:36.092 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.092 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:36.349 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:36.349 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:36.349 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:36.349 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:36.912 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:36.912 10:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:37.169 10:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:37.169 10:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:37.733 10:27:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:37.733 10:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:37.733 10:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:37.733 10:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:37.990 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:37.990 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:37.991 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:38.248 [2024-07-15 10:27:15.399445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:38.248 [2024-07-15 10:27:15.400831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:38.248 [2024-07-15 10:27:15.400872] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:38.248 [2024-07-15 10:27:15.400906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:38.248 [2024-07-15 10:27:15.400956] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:38.248 [2024-07-15 10:27:15.400995] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:38.248 [2024-07-15 10:27:15.401018] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:38.248 [2024-07-15 10:27:15.401040] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:38.248 
[2024-07-15 10:27:15.401065] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:38.248 [2024-07-15 10:27:15.401077] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd5bff0 name raid_bdev1, state configuring 00:19:38.248 request: 00:19:38.248 { 00:19:38.248 "name": "raid_bdev1", 00:19:38.248 "raid_level": "raid0", 00:19:38.248 "base_bdevs": [ 00:19:38.248 "malloc1", 00:19:38.248 "malloc2", 00:19:38.248 "malloc3", 00:19:38.248 "malloc4" 00:19:38.248 ], 00:19:38.248 "strip_size_kb": 64, 00:19:38.248 "superblock": false, 00:19:38.248 "method": "bdev_raid_create", 00:19:38.248 "req_id": 1 00:19:38.248 } 00:19:38.248 Got JSON-RPC error response 00:19:38.248 response: 00:19:38.248 { 00:19:38.248 "code": -17, 00:19:38.248 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:38.248 } 00:19:38.248 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:38.248 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:38.248 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:38.248 10:27:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:38.248 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.248 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:38.506 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:38.506 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:38.506 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
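The `NOT` wrapper at `bdev_raid.sh@456` expects this second `bdev_raid_create` over the same malloc bdevs to fail: their superblocks already name `raid_bdev1`, so the target answers with JSON-RPC error `-17` ("File exists", i.e. `-EEXIST`), and the `es=1` bookkeeping above records the expected non-zero exit. A hedged Python sketch of classifying that error reply, using the response body captured in the log (whitespace normalized):

```python
import json

# The JSON-RPC error reply captured in the trace above.
response = json.loads("""
{"code": -17,
 "message": "Failed to create RAID bdev raid_bdev1: File exists"}
""")

# -17 is -EEXIST; the test treats this failure as the expected outcome of
# re-creating a raid bdev whose base bdevs already carry its superblock.
expected_failure = (response["code"] == -17
                    and "File exists" in response["message"])
print(expected_failure)  # True
```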
00:19:38.764 [2024-07-15 10:27:15.892687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:38.764 [2024-07-15 10:27:15.892732] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:38.764 [2024-07-15 10:27:15.892753] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb87a0 00:19:38.764 [2024-07-15 10:27:15.892766] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:38.764 [2024-07-15 10:27:15.894366] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:38.764 [2024-07-15 10:27:15.894395] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:38.764 [2024-07-15 10:27:15.894461] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:38.764 [2024-07-15 10:27:15.894486] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:38.764 pt1 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.764 10:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:39.022 10:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.022 "name": "raid_bdev1", 00:19:39.022 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:39.022 "strip_size_kb": 64, 00:19:39.022 "state": "configuring", 00:19:39.022 "raid_level": "raid0", 00:19:39.022 "superblock": true, 00:19:39.022 "num_base_bdevs": 4, 00:19:39.022 "num_base_bdevs_discovered": 1, 00:19:39.022 "num_base_bdevs_operational": 4, 00:19:39.022 "base_bdevs_list": [ 00:19:39.022 { 00:19:39.022 "name": "pt1", 00:19:39.022 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:39.022 "is_configured": true, 00:19:39.022 "data_offset": 2048, 00:19:39.022 "data_size": 63488 00:19:39.022 }, 00:19:39.022 { 00:19:39.022 "name": null, 00:19:39.022 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:39.022 "is_configured": false, 00:19:39.022 "data_offset": 2048, 00:19:39.022 "data_size": 63488 00:19:39.022 }, 00:19:39.022 { 00:19:39.022 "name": null, 00:19:39.022 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:39.022 "is_configured": false, 00:19:39.022 "data_offset": 2048, 00:19:39.022 "data_size": 63488 00:19:39.022 }, 00:19:39.022 { 00:19:39.022 "name": null, 00:19:39.022 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:39.022 "is_configured": false, 00:19:39.022 "data_offset": 2048, 00:19:39.022 "data_size": 63488 00:19:39.022 } 00:19:39.022 ] 00:19:39.022 }' 00:19:39.022 10:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.022 10:27:16 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:39.588 10:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:39.588 10:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:39.845 [2024-07-15 10:27:16.975569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:39.845 [2024-07-15 10:27:16.975616] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:39.845 [2024-07-15 10:27:16.975636] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd51940 00:19:39.845 [2024-07-15 10:27:16.975649] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:39.845 [2024-07-15 10:27:16.976020] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:39.845 [2024-07-15 10:27:16.976041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:39.845 [2024-07-15 10:27:16.976102] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:39.845 [2024-07-15 10:27:16.976122] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:39.845 pt2 00:19:39.846 10:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:40.104 [2024-07-15 10:27:17.220239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:40.104 10:27:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.104 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.361 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.361 "name": "raid_bdev1", 00:19:40.361 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:40.361 "strip_size_kb": 64, 00:19:40.361 "state": "configuring", 00:19:40.361 "raid_level": "raid0", 00:19:40.361 "superblock": true, 00:19:40.361 "num_base_bdevs": 4, 00:19:40.361 "num_base_bdevs_discovered": 1, 00:19:40.361 "num_base_bdevs_operational": 4, 00:19:40.361 "base_bdevs_list": [ 00:19:40.361 { 00:19:40.361 "name": "pt1", 00:19:40.361 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:40.361 "is_configured": true, 00:19:40.361 "data_offset": 2048, 00:19:40.361 "data_size": 63488 00:19:40.361 }, 00:19:40.361 { 00:19:40.361 "name": null, 00:19:40.361 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:40.361 
"is_configured": false, 00:19:40.361 "data_offset": 2048, 00:19:40.361 "data_size": 63488 00:19:40.361 }, 00:19:40.361 { 00:19:40.361 "name": null, 00:19:40.361 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:40.361 "is_configured": false, 00:19:40.361 "data_offset": 2048, 00:19:40.361 "data_size": 63488 00:19:40.361 }, 00:19:40.361 { 00:19:40.361 "name": null, 00:19:40.361 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:40.361 "is_configured": false, 00:19:40.361 "data_offset": 2048, 00:19:40.361 "data_size": 63488 00:19:40.361 } 00:19:40.361 ] 00:19:40.361 }' 00:19:40.361 10:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.361 10:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.926 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:40.926 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:40.926 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:41.183 [2024-07-15 10:27:18.315128] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:41.183 [2024-07-15 10:27:18.315179] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.183 [2024-07-15 10:27:18.315198] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbaf060 00:19:41.183 [2024-07-15 10:27:18.315211] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.183 [2024-07-15 10:27:18.315549] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.183 [2024-07-15 10:27:18.315569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:41.183 [2024-07-15 10:27:18.315630] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:41.183 [2024-07-15 10:27:18.315649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:41.183 pt2 00:19:41.183 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:41.183 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:41.183 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:41.441 [2024-07-15 10:27:18.559781] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:41.441 [2024-07-15 10:27:18.559819] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.441 [2024-07-15 10:27:18.559839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb18d0 00:19:41.441 [2024-07-15 10:27:18.559851] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.441 [2024-07-15 10:27:18.560159] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.441 [2024-07-15 10:27:18.560177] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:41.441 [2024-07-15 10:27:18.560230] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:41.441 [2024-07-15 10:27:18.560247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:41.441 pt3 00:19:41.441 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:41.441 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:41.441 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:41.699 [2024-07-15 10:27:18.804433] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:41.699 [2024-07-15 10:27:18.804474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:41.699 [2024-07-15 10:27:18.804491] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbb2b80 00:19:41.699 [2024-07-15 10:27:18.804508] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:41.699 [2024-07-15 10:27:18.804812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:41.699 [2024-07-15 10:27:18.804831] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:41.699 [2024-07-15 10:27:18.804884] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:41.699 [2024-07-15 10:27:18.804902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:41.699 [2024-07-15 10:27:18.805031] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbaf780 00:19:41.699 [2024-07-15 10:27:18.805042] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:41.699 [2024-07-15 10:27:18.805214] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb4d70 00:19:41.699 [2024-07-15 10:27:18.805340] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbaf780 00:19:41.699 [2024-07-15 10:27:18.805350] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbaf780 00:19:41.699 [2024-07-15 10:27:18.805446] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:41.699 pt4 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.699 10:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.958 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.958 "name": "raid_bdev1", 00:19:41.958 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:41.958 "strip_size_kb": 64, 00:19:41.958 "state": "online", 00:19:41.958 "raid_level": "raid0", 00:19:41.958 "superblock": true, 00:19:41.958 "num_base_bdevs": 4, 00:19:41.958 "num_base_bdevs_discovered": 4, 00:19:41.958 "num_base_bdevs_operational": 4, 00:19:41.958 "base_bdevs_list": [ 00:19:41.958 { 00:19:41.958 
"name": "pt1", 00:19:41.958 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:41.958 "is_configured": true, 00:19:41.958 "data_offset": 2048, 00:19:41.958 "data_size": 63488 00:19:41.958 }, 00:19:41.958 { 00:19:41.958 "name": "pt2", 00:19:41.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:41.958 "is_configured": true, 00:19:41.958 "data_offset": 2048, 00:19:41.958 "data_size": 63488 00:19:41.958 }, 00:19:41.958 { 00:19:41.958 "name": "pt3", 00:19:41.958 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:41.958 "is_configured": true, 00:19:41.958 "data_offset": 2048, 00:19:41.958 "data_size": 63488 00:19:41.958 }, 00:19:41.958 { 00:19:41.958 "name": "pt4", 00:19:41.958 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:41.958 "is_configured": true, 00:19:41.958 "data_offset": 2048, 00:19:41.958 "data_size": 63488 00:19:41.958 } 00:19:41.958 ] 00:19:41.958 }' 00:19:41.958 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.958 10:27:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:42.524 10:27:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:42.782 [2024-07-15 10:27:19.899655] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:42.782 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:42.782 "name": "raid_bdev1", 00:19:42.782 "aliases": [ 00:19:42.782 "0287b0ec-f93d-44ed-b7cb-2eabea922bb0" 00:19:42.782 ], 00:19:42.782 "product_name": "Raid Volume", 00:19:42.782 "block_size": 512, 00:19:42.782 "num_blocks": 253952, 00:19:42.782 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:42.782 "assigned_rate_limits": { 00:19:42.782 "rw_ios_per_sec": 0, 00:19:42.782 "rw_mbytes_per_sec": 0, 00:19:42.782 "r_mbytes_per_sec": 0, 00:19:42.782 "w_mbytes_per_sec": 0 00:19:42.782 }, 00:19:42.782 "claimed": false, 00:19:42.782 "zoned": false, 00:19:42.782 "supported_io_types": { 00:19:42.782 "read": true, 00:19:42.782 "write": true, 00:19:42.782 "unmap": true, 00:19:42.782 "flush": true, 00:19:42.782 "reset": true, 00:19:42.782 "nvme_admin": false, 00:19:42.782 "nvme_io": false, 00:19:42.782 "nvme_io_md": false, 00:19:42.782 "write_zeroes": true, 00:19:42.782 "zcopy": false, 00:19:42.782 "get_zone_info": false, 00:19:42.782 "zone_management": false, 00:19:42.782 "zone_append": false, 00:19:42.782 "compare": false, 00:19:42.782 "compare_and_write": false, 00:19:42.782 "abort": false, 00:19:42.782 "seek_hole": false, 00:19:42.782 "seek_data": false, 00:19:42.782 "copy": false, 00:19:42.782 "nvme_iov_md": false 00:19:42.782 }, 00:19:42.782 "memory_domains": [ 00:19:42.782 { 00:19:42.782 "dma_device_id": "system", 00:19:42.782 "dma_device_type": 1 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.782 "dma_device_type": 2 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "dma_device_id": "system", 00:19:42.782 "dma_device_type": 1 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.782 "dma_device_type": 2 00:19:42.782 }, 
00:19:42.782 { 00:19:42.782 "dma_device_id": "system", 00:19:42.782 "dma_device_type": 1 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.782 "dma_device_type": 2 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "dma_device_id": "system", 00:19:42.782 "dma_device_type": 1 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.782 "dma_device_type": 2 00:19:42.782 } 00:19:42.782 ], 00:19:42.782 "driver_specific": { 00:19:42.782 "raid": { 00:19:42.782 "uuid": "0287b0ec-f93d-44ed-b7cb-2eabea922bb0", 00:19:42.782 "strip_size_kb": 64, 00:19:42.782 "state": "online", 00:19:42.782 "raid_level": "raid0", 00:19:42.782 "superblock": true, 00:19:42.782 "num_base_bdevs": 4, 00:19:42.782 "num_base_bdevs_discovered": 4, 00:19:42.782 "num_base_bdevs_operational": 4, 00:19:42.782 "base_bdevs_list": [ 00:19:42.782 { 00:19:42.782 "name": "pt1", 00:19:42.782 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:42.782 "is_configured": true, 00:19:42.782 "data_offset": 2048, 00:19:42.782 "data_size": 63488 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "name": "pt2", 00:19:42.782 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:42.782 "is_configured": true, 00:19:42.782 "data_offset": 2048, 00:19:42.782 "data_size": 63488 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "name": "pt3", 00:19:42.782 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:42.782 "is_configured": true, 00:19:42.782 "data_offset": 2048, 00:19:42.782 "data_size": 63488 00:19:42.782 }, 00:19:42.782 { 00:19:42.782 "name": "pt4", 00:19:42.782 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:42.782 "is_configured": true, 00:19:42.782 "data_offset": 2048, 00:19:42.782 "data_size": 63488 00:19:42.782 } 00:19:42.782 ] 00:19:42.782 } 00:19:42.782 } 00:19:42.782 }' 00:19:42.782 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:19:42.782 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:42.782 pt2 00:19:42.782 pt3 00:19:42.782 pt4' 00:19:42.782 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.782 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:42.782 10:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.039 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.039 "name": "pt1", 00:19:43.039 "aliases": [ 00:19:43.039 "00000000-0000-0000-0000-000000000001" 00:19:43.039 ], 00:19:43.039 "product_name": "passthru", 00:19:43.039 "block_size": 512, 00:19:43.039 "num_blocks": 65536, 00:19:43.039 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:43.039 "assigned_rate_limits": { 00:19:43.039 "rw_ios_per_sec": 0, 00:19:43.039 "rw_mbytes_per_sec": 0, 00:19:43.039 "r_mbytes_per_sec": 0, 00:19:43.039 "w_mbytes_per_sec": 0 00:19:43.039 }, 00:19:43.039 "claimed": true, 00:19:43.039 "claim_type": "exclusive_write", 00:19:43.039 "zoned": false, 00:19:43.039 "supported_io_types": { 00:19:43.039 "read": true, 00:19:43.039 "write": true, 00:19:43.039 "unmap": true, 00:19:43.039 "flush": true, 00:19:43.039 "reset": true, 00:19:43.039 "nvme_admin": false, 00:19:43.039 "nvme_io": false, 00:19:43.039 "nvme_io_md": false, 00:19:43.039 "write_zeroes": true, 00:19:43.039 "zcopy": true, 00:19:43.039 "get_zone_info": false, 00:19:43.039 "zone_management": false, 00:19:43.039 "zone_append": false, 00:19:43.039 "compare": false, 00:19:43.039 "compare_and_write": false, 00:19:43.039 "abort": true, 00:19:43.039 "seek_hole": false, 00:19:43.039 "seek_data": false, 00:19:43.039 "copy": true, 00:19:43.039 "nvme_iov_md": false 00:19:43.039 }, 00:19:43.039 "memory_domains": [ 00:19:43.039 { 
00:19:43.039 "dma_device_id": "system", 00:19:43.039 "dma_device_type": 1 00:19:43.039 }, 00:19:43.039 { 00:19:43.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.039 "dma_device_type": 2 00:19:43.039 } 00:19:43.039 ], 00:19:43.040 "driver_specific": { 00:19:43.040 "passthru": { 00:19:43.040 "name": "pt1", 00:19:43.040 "base_bdev_name": "malloc1" 00:19:43.040 } 00:19:43.040 } 00:19:43.040 }' 00:19:43.040 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.298 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.601 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.601 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.601 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.601 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:43.601 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.867 
10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.867 "name": "pt2", 00:19:43.867 "aliases": [ 00:19:43.867 "00000000-0000-0000-0000-000000000002" 00:19:43.867 ], 00:19:43.867 "product_name": "passthru", 00:19:43.867 "block_size": 512, 00:19:43.867 "num_blocks": 65536, 00:19:43.867 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:43.867 "assigned_rate_limits": { 00:19:43.867 "rw_ios_per_sec": 0, 00:19:43.867 "rw_mbytes_per_sec": 0, 00:19:43.867 "r_mbytes_per_sec": 0, 00:19:43.867 "w_mbytes_per_sec": 0 00:19:43.867 }, 00:19:43.867 "claimed": true, 00:19:43.867 "claim_type": "exclusive_write", 00:19:43.867 "zoned": false, 00:19:43.867 "supported_io_types": { 00:19:43.867 "read": true, 00:19:43.867 "write": true, 00:19:43.867 "unmap": true, 00:19:43.867 "flush": true, 00:19:43.867 "reset": true, 00:19:43.867 "nvme_admin": false, 00:19:43.867 "nvme_io": false, 00:19:43.867 "nvme_io_md": false, 00:19:43.867 "write_zeroes": true, 00:19:43.867 "zcopy": true, 00:19:43.867 "get_zone_info": false, 00:19:43.867 "zone_management": false, 00:19:43.867 "zone_append": false, 00:19:43.867 "compare": false, 00:19:43.867 "compare_and_write": false, 00:19:43.867 "abort": true, 00:19:43.867 "seek_hole": false, 00:19:43.867 "seek_data": false, 00:19:43.867 "copy": true, 00:19:43.867 "nvme_iov_md": false 00:19:43.867 }, 00:19:43.867 "memory_domains": [ 00:19:43.867 { 00:19:43.867 "dma_device_id": "system", 00:19:43.867 "dma_device_type": 1 00:19:43.867 }, 00:19:43.867 { 00:19:43.867 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.867 "dma_device_type": 2 00:19:43.867 } 00:19:43.867 ], 00:19:43.867 "driver_specific": { 00:19:43.867 "passthru": { 00:19:43.867 "name": "pt2", 00:19:43.867 "base_bdev_name": "malloc2" 00:19:43.867 } 00:19:43.867 } 00:19:43.867 }' 00:19:43.867 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.867 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:19:43.867 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.867 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.867 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.867 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.867 10:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.867 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.867 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.867 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.125 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.125 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.125 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:44.125 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:44.125 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.383 "name": "pt3", 00:19:44.383 "aliases": [ 00:19:44.383 "00000000-0000-0000-0000-000000000003" 00:19:44.383 ], 00:19:44.383 "product_name": "passthru", 00:19:44.383 "block_size": 512, 00:19:44.383 "num_blocks": 65536, 00:19:44.383 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:44.383 "assigned_rate_limits": { 00:19:44.383 "rw_ios_per_sec": 0, 00:19:44.383 "rw_mbytes_per_sec": 0, 00:19:44.383 "r_mbytes_per_sec": 0, 00:19:44.383 "w_mbytes_per_sec": 0 00:19:44.383 }, 
00:19:44.383 "claimed": true, 00:19:44.383 "claim_type": "exclusive_write", 00:19:44.383 "zoned": false, 00:19:44.383 "supported_io_types": { 00:19:44.383 "read": true, 00:19:44.383 "write": true, 00:19:44.383 "unmap": true, 00:19:44.383 "flush": true, 00:19:44.383 "reset": true, 00:19:44.383 "nvme_admin": false, 00:19:44.383 "nvme_io": false, 00:19:44.383 "nvme_io_md": false, 00:19:44.383 "write_zeroes": true, 00:19:44.383 "zcopy": true, 00:19:44.383 "get_zone_info": false, 00:19:44.383 "zone_management": false, 00:19:44.383 "zone_append": false, 00:19:44.383 "compare": false, 00:19:44.383 "compare_and_write": false, 00:19:44.383 "abort": true, 00:19:44.383 "seek_hole": false, 00:19:44.383 "seek_data": false, 00:19:44.383 "copy": true, 00:19:44.383 "nvme_iov_md": false 00:19:44.383 }, 00:19:44.383 "memory_domains": [ 00:19:44.383 { 00:19:44.383 "dma_device_id": "system", 00:19:44.383 "dma_device_type": 1 00:19:44.383 }, 00:19:44.383 { 00:19:44.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.383 "dma_device_type": 2 00:19:44.383 } 00:19:44.383 ], 00:19:44.383 "driver_specific": { 00:19:44.383 "passthru": { 00:19:44.383 "name": "pt3", 00:19:44.383 "base_bdev_name": "malloc3" 00:19:44.383 } 00:19:44.383 } 00:19:44.383 }' 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.383 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:44.640 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.897 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.897 "name": "pt4", 00:19:44.897 "aliases": [ 00:19:44.897 "00000000-0000-0000-0000-000000000004" 00:19:44.897 ], 00:19:44.897 "product_name": "passthru", 00:19:44.897 "block_size": 512, 00:19:44.897 "num_blocks": 65536, 00:19:44.897 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:44.897 "assigned_rate_limits": { 00:19:44.897 "rw_ios_per_sec": 0, 00:19:44.897 "rw_mbytes_per_sec": 0, 00:19:44.897 "r_mbytes_per_sec": 0, 00:19:44.897 "w_mbytes_per_sec": 0 00:19:44.897 }, 00:19:44.897 "claimed": true, 00:19:44.897 "claim_type": "exclusive_write", 00:19:44.897 "zoned": false, 00:19:44.897 "supported_io_types": { 00:19:44.897 "read": true, 00:19:44.897 "write": true, 00:19:44.897 "unmap": true, 00:19:44.897 "flush": true, 00:19:44.897 "reset": true, 00:19:44.897 "nvme_admin": false, 00:19:44.897 "nvme_io": false, 00:19:44.897 "nvme_io_md": false, 00:19:44.897 "write_zeroes": true, 00:19:44.897 "zcopy": true, 00:19:44.897 "get_zone_info": false, 00:19:44.897 "zone_management": false, 00:19:44.897 "zone_append": false, 00:19:44.897 
"compare": false, 00:19:44.897 "compare_and_write": false, 00:19:44.897 "abort": true, 00:19:44.897 "seek_hole": false, 00:19:44.897 "seek_data": false, 00:19:44.897 "copy": true, 00:19:44.897 "nvme_iov_md": false 00:19:44.897 }, 00:19:44.897 "memory_domains": [ 00:19:44.897 { 00:19:44.897 "dma_device_id": "system", 00:19:44.897 "dma_device_type": 1 00:19:44.897 }, 00:19:44.897 { 00:19:44.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.897 "dma_device_type": 2 00:19:44.897 } 00:19:44.897 ], 00:19:44.897 "driver_specific": { 00:19:44.897 "passthru": { 00:19:44.897 "name": "pt4", 00:19:44.897 "base_bdev_name": "malloc4" 00:19:44.897 } 00:19:44.897 } 00:19:44.897 }' 00:19:44.897 10:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.897 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.897 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.897 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:45.154 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:45.412 [2024-07-15 10:27:22.550660] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 0287b0ec-f93d-44ed-b7cb-2eabea922bb0 '!=' 0287b0ec-f93d-44ed-b7cb-2eabea922bb0 ']' 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 545414 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 545414 ']' 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 545414 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:45.412 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 545414 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 545414' 00:19:45.670 killing process with pid 545414 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 545414 00:19:45.670 [2024-07-15 
10:27:22.619888] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:45.670 [2024-07-15 10:27:22.619955] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:45.670 [2024-07-15 10:27:22.620018] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:45.670 [2024-07-15 10:27:22.620030] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbaf780 name raid_bdev1, state offline 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 545414 00:19:45.670 [2024-07-15 10:27:22.655954] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:45.670 00:19:45.670 real 0m17.340s 00:19:45.670 user 0m31.335s 00:19:45.670 sys 0m3.093s 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:45.670 10:27:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.670 ************************************ 00:19:45.670 END TEST raid_superblock_test 00:19:45.670 ************************************ 00:19:45.928 10:27:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:45.928 10:27:22 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:19:45.928 10:27:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:45.928 10:27:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:45.928 10:27:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:45.928 ************************************ 00:19:45.928 START TEST raid_read_error_test 00:19:45.928 ************************************ 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:19:45.928 10:27:22 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.YCewynqzky 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=548054 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 548054 /var/tmp/spdk-raid.sock 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 548054 ']' 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:19:45.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:45.928 10:27:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.928 [2024-07-15 10:27:23.022869] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:19:45.928 [2024-07-15 10:27:23.022947] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid548054 ] 00:19:46.187 [2024-07-15 10:27:23.154519] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:46.187 [2024-07-15 10:27:23.260880] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:46.187 [2024-07-15 10:27:23.331789] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:46.187 [2024-07-15 10:27:23.331829] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:47.120 10:27:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:47.120 10:27:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:47.120 10:27:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:47.120 10:27:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:47.120 BaseBdev1_malloc 00:19:47.120 10:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:47.380 true 00:19:47.380 
10:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:47.638 [2024-07-15 10:27:24.670785] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:47.638 [2024-07-15 10:27:24.670828] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.638 [2024-07-15 10:27:24.670848] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f20d0 00:19:47.638 [2024-07-15 10:27:24.670861] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:47.638 [2024-07-15 10:27:24.672736] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.638 [2024-07-15 10:27:24.672766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:47.638 BaseBdev1 00:19:47.638 10:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:47.638 10:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:47.896 BaseBdev2_malloc 00:19:47.896 10:27:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:48.154 true 00:19:48.154 10:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:48.412 [2024-07-15 10:27:25.394559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:48.412 [2024-07-15 10:27:25.394603] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.412 [2024-07-15 10:27:25.394624] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f6910 00:19:48.412 [2024-07-15 10:27:25.394637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.412 [2024-07-15 10:27:25.396240] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.412 [2024-07-15 10:27:25.396270] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:48.412 BaseBdev2 00:19:48.412 10:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:48.413 10:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:48.670 BaseBdev3_malloc 00:19:48.670 10:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:48.929 true 00:19:48.929 10:27:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:48.929 [2024-07-15 10:27:26.122341] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:48.929 [2024-07-15 10:27:26.122386] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.929 [2024-07-15 10:27:26.122406] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f8bd0 00:19:48.929 [2024-07-15 10:27:26.122419] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.929 [2024-07-15 10:27:26.124002] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:19:48.929 [2024-07-15 10:27:26.124032] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:48.929 BaseBdev3 00:19:49.188 10:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:49.188 10:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:49.188 BaseBdev4_malloc 00:19:49.447 10:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:49.447 true 00:19:49.447 10:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:49.707 [2024-07-15 10:27:26.852834] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:49.707 [2024-07-15 10:27:26.852882] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:49.707 [2024-07-15 10:27:26.852903] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f9aa0 00:19:49.707 [2024-07-15 10:27:26.852916] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:49.707 [2024-07-15 10:27:26.854530] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:49.707 [2024-07-15 10:27:26.854562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:49.707 BaseBdev4 00:19:49.707 10:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:19:49.997 [2024-07-15 10:27:27.093509] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:49.997 [2024-07-15 10:27:27.094888] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:49.997 [2024-07-15 10:27:27.094964] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:49.997 [2024-07-15 10:27:27.095026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:49.997 [2024-07-15 10:27:27.095256] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f3c20 00:19:49.997 [2024-07-15 10:27:27.095268] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:49.997 [2024-07-15 10:27:27.095474] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f48260 00:19:49.997 [2024-07-15 10:27:27.095630] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f3c20 00:19:49.997 [2024-07-15 10:27:27.095640] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f3c20 00:19:49.997 [2024-07-15 10:27:27.095747] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.997 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:50.256 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.256 "name": "raid_bdev1", 00:19:50.256 "uuid": "e64bb729-9f99-4512-b4ed-8ddfd911f3aa", 00:19:50.256 "strip_size_kb": 64, 00:19:50.256 "state": "online", 00:19:50.256 "raid_level": "raid0", 00:19:50.256 "superblock": true, 00:19:50.256 "num_base_bdevs": 4, 00:19:50.256 "num_base_bdevs_discovered": 4, 00:19:50.256 "num_base_bdevs_operational": 4, 00:19:50.256 "base_bdevs_list": [ 00:19:50.256 { 00:19:50.256 "name": "BaseBdev1", 00:19:50.256 "uuid": "bdd86dc6-118a-505a-a6cf-98c31dd7829b", 00:19:50.256 "is_configured": true, 00:19:50.256 "data_offset": 2048, 00:19:50.256 "data_size": 63488 00:19:50.256 }, 00:19:50.256 { 00:19:50.256 "name": "BaseBdev2", 00:19:50.256 "uuid": "3181174e-f656-5900-b936-2c11af96bfb1", 00:19:50.256 "is_configured": true, 00:19:50.256 "data_offset": 2048, 00:19:50.256 "data_size": 63488 00:19:50.256 }, 00:19:50.256 { 00:19:50.256 "name": "BaseBdev3", 00:19:50.256 "uuid": "c624f5d2-11d2-5af1-962e-bbc76331c7c2", 00:19:50.256 "is_configured": true, 00:19:50.256 "data_offset": 2048, 00:19:50.256 "data_size": 63488 00:19:50.256 }, 00:19:50.256 { 00:19:50.256 "name": "BaseBdev4", 00:19:50.256 "uuid": "837b0d91-6b7c-51bd-b0be-97fef8835205", 00:19:50.256 
"is_configured": true, 00:19:50.256 "data_offset": 2048, 00:19:50.256 "data_size": 63488 00:19:50.256 } 00:19:50.256 ] 00:19:50.256 }' 00:19:50.256 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.256 10:27:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.824 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:50.824 10:27:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:50.824 [2024-07-15 10:27:27.996174] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e5fc0 00:19:51.760 10:27:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.018 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:52.276 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.276 "name": "raid_bdev1", 00:19:52.276 "uuid": "e64bb729-9f99-4512-b4ed-8ddfd911f3aa", 00:19:52.276 "strip_size_kb": 64, 00:19:52.276 "state": "online", 00:19:52.276 "raid_level": "raid0", 00:19:52.276 "superblock": true, 00:19:52.276 "num_base_bdevs": 4, 00:19:52.276 "num_base_bdevs_discovered": 4, 00:19:52.276 "num_base_bdevs_operational": 4, 00:19:52.276 "base_bdevs_list": [ 00:19:52.276 { 00:19:52.276 "name": "BaseBdev1", 00:19:52.276 "uuid": "bdd86dc6-118a-505a-a6cf-98c31dd7829b", 00:19:52.276 "is_configured": true, 00:19:52.276 "data_offset": 2048, 00:19:52.276 "data_size": 63488 00:19:52.276 }, 00:19:52.276 { 00:19:52.276 "name": "BaseBdev2", 00:19:52.276 "uuid": "3181174e-f656-5900-b936-2c11af96bfb1", 00:19:52.276 "is_configured": true, 00:19:52.276 "data_offset": 2048, 00:19:52.276 "data_size": 63488 00:19:52.276 }, 00:19:52.276 { 00:19:52.276 "name": "BaseBdev3", 00:19:52.276 "uuid": "c624f5d2-11d2-5af1-962e-bbc76331c7c2", 00:19:52.276 "is_configured": true, 00:19:52.276 "data_offset": 2048, 00:19:52.276 "data_size": 63488 00:19:52.276 }, 00:19:52.276 { 00:19:52.276 "name": "BaseBdev4", 00:19:52.276 "uuid": 
"837b0d91-6b7c-51bd-b0be-97fef8835205", 00:19:52.276 "is_configured": true, 00:19:52.276 "data_offset": 2048, 00:19:52.276 "data_size": 63488 00:19:52.276 } 00:19:52.276 ] 00:19:52.276 }' 00:19:52.276 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.276 10:27:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.843 10:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:53.102 [2024-07-15 10:27:30.156315] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:53.102 [2024-07-15 10:27:30.156352] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:53.102 [2024-07-15 10:27:30.159520] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:53.102 [2024-07-15 10:27:30.159558] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:53.102 [2024-07-15 10:27:30.159599] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:53.102 [2024-07-15 10:27:30.159610] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f3c20 name raid_bdev1, state offline 00:19:53.102 0 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 548054 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 548054 ']' 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 548054 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps 
--no-headers -o comm= 548054 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 548054' 00:19:53.102 killing process with pid 548054 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 548054 00:19:53.102 [2024-07-15 10:27:30.226096] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:53.102 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 548054 00:19:53.102 [2024-07-15 10:27:30.256461] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.YCewynqzky 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:19:53.361 00:19:53.361 real 0m7.540s 00:19:53.361 user 0m12.039s 00:19:53.361 sys 0m1.343s 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:53.361 10:27:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.361 
************************************ 00:19:53.361 END TEST raid_read_error_test 00:19:53.361 ************************************ 00:19:53.361 10:27:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:53.361 10:27:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:53.361 10:27:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:53.361 10:27:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:53.361 10:27:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:53.620 ************************************ 00:19:53.620 START TEST raid_write_error_test 00:19:53.620 ************************************ 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZGmiQZ5jbZ 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # 
raid_pid=549198 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 549198 /var/tmp/spdk-raid.sock 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 549198 ']' 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:53.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:53.620 10:27:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.620 [2024-07-15 10:27:30.647733] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:19:53.620 [2024-07-15 10:27:30.647801] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid549198 ] 00:19:53.620 [2024-07-15 10:27:30.769044] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:53.879 [2024-07-15 10:27:30.875302] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:53.879 [2024-07-15 10:27:30.941050] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:53.879 [2024-07-15 10:27:30.941087] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:54.446 10:27:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:54.446 10:27:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:54.446 10:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:54.446 10:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:54.705 BaseBdev1_malloc 00:19:54.705 10:27:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:54.963 true 00:19:54.963 10:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:55.221 [2024-07-15 10:27:32.335183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:55.221 [2024-07-15 10:27:32.335230] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:19:55.221 [2024-07-15 10:27:32.335252] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe290d0 00:19:55.221 [2024-07-15 10:27:32.335265] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.221 [2024-07-15 10:27:32.337166] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.221 [2024-07-15 10:27:32.337198] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:55.221 BaseBdev1 00:19:55.221 10:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:55.222 10:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:55.480 BaseBdev2_malloc 00:19:55.480 10:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:55.739 true 00:19:55.739 10:27:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:55.997 [2024-07-15 10:27:33.070997] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:55.997 [2024-07-15 10:27:33.071041] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:55.997 [2024-07-15 10:27:33.071063] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2d910 00:19:55.997 [2024-07-15 10:27:33.071075] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:55.997 [2024-07-15 10:27:33.072614] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:55.997 [2024-07-15 10:27:33.072642] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:55.997 BaseBdev2 00:19:55.997 10:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:55.997 10:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:56.255 BaseBdev3_malloc 00:19:56.255 10:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:56.513 true 00:19:56.513 10:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:56.771 [2024-07-15 10:27:33.809498] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:56.772 [2024-07-15 10:27:33.809542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:56.772 [2024-07-15 10:27:33.809564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe2fbd0 00:19:56.772 [2024-07-15 10:27:33.809577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:56.772 [2024-07-15 10:27:33.811152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:56.772 [2024-07-15 10:27:33.811181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:56.772 BaseBdev3 00:19:56.772 10:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:56.772 10:27:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:57.031 BaseBdev4_malloc 00:19:57.031 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:57.290 true 00:19:57.290 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:57.549 [2024-07-15 10:27:34.545318] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:57.550 [2024-07-15 10:27:34.545363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:57.550 [2024-07-15 10:27:34.545386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe30aa0 00:19:57.550 [2024-07-15 10:27:34.545399] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:57.550 [2024-07-15 10:27:34.547016] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:57.550 [2024-07-15 10:27:34.547046] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:57.550 BaseBdev4 00:19:57.550 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:57.809 [2024-07-15 10:27:34.790017] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:57.809 [2024-07-15 10:27:34.791404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:57.809 [2024-07-15 10:27:34.791474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:57.809 [2024-07-15 10:27:34.791534] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:57.809 [2024-07-15 10:27:34.791773] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe2ac20 00:19:57.809 [2024-07-15 10:27:34.791784] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:57.809 [2024-07-15 10:27:34.791994] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc7f260 00:19:57.809 [2024-07-15 10:27:34.792149] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe2ac20 00:19:57.809 [2024-07-15 10:27:34.792173] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe2ac20 00:19:57.809 [2024-07-15 10:27:34.792283] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:57.809 10:27:34 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.809 10:27:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:58.069 10:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.069 "name": "raid_bdev1", 00:19:58.069 "uuid": "a6be6c7b-ccee-49a5-8ba9-d283d4eb6966", 00:19:58.069 "strip_size_kb": 64, 00:19:58.069 "state": "online", 00:19:58.069 "raid_level": "raid0", 00:19:58.069 "superblock": true, 00:19:58.069 "num_base_bdevs": 4, 00:19:58.069 "num_base_bdevs_discovered": 4, 00:19:58.069 "num_base_bdevs_operational": 4, 00:19:58.069 "base_bdevs_list": [ 00:19:58.069 { 00:19:58.069 "name": "BaseBdev1", 00:19:58.069 "uuid": "e9116f93-855e-526f-8f4e-478b4e9901c4", 00:19:58.069 "is_configured": true, 00:19:58.069 "data_offset": 2048, 00:19:58.069 "data_size": 63488 00:19:58.069 }, 00:19:58.069 { 00:19:58.069 "name": "BaseBdev2", 00:19:58.069 "uuid": "a8ce4853-e791-5648-a2ba-d3ae42a1739e", 00:19:58.069 "is_configured": true, 00:19:58.069 "data_offset": 2048, 00:19:58.069 "data_size": 63488 00:19:58.069 }, 00:19:58.069 { 00:19:58.069 "name": "BaseBdev3", 00:19:58.069 "uuid": "73fd9d8b-51a4-5bd6-9ac9-9f8d54178b9c", 00:19:58.069 "is_configured": true, 00:19:58.069 "data_offset": 2048, 00:19:58.069 "data_size": 63488 00:19:58.069 }, 00:19:58.069 { 00:19:58.069 "name": "BaseBdev4", 00:19:58.069 "uuid": "a0cd340a-79d7-55a4-ab24-77884449c5d2", 00:19:58.069 "is_configured": true, 00:19:58.069 "data_offset": 2048, 00:19:58.069 "data_size": 63488 00:19:58.069 } 00:19:58.069 ] 00:19:58.069 }' 00:19:58.069 10:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.069 10:27:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:58.638 10:27:35 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:19:58.638 10:27:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:58.638 [2024-07-15 10:27:35.688671] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe1cfc0 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.575 10:27:36 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.575 10:27:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.834 10:27:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.834 "name": "raid_bdev1", 00:19:59.834 "uuid": "a6be6c7b-ccee-49a5-8ba9-d283d4eb6966", 00:19:59.834 "strip_size_kb": 64, 00:19:59.834 "state": "online", 00:19:59.834 "raid_level": "raid0", 00:19:59.834 "superblock": true, 00:19:59.834 "num_base_bdevs": 4, 00:19:59.834 "num_base_bdevs_discovered": 4, 00:19:59.834 "num_base_bdevs_operational": 4, 00:19:59.834 "base_bdevs_list": [ 00:19:59.834 { 00:19:59.834 "name": "BaseBdev1", 00:19:59.834 "uuid": "e9116f93-855e-526f-8f4e-478b4e9901c4", 00:19:59.834 "is_configured": true, 00:19:59.834 "data_offset": 2048, 00:19:59.834 "data_size": 63488 00:19:59.834 }, 00:19:59.834 { 00:19:59.834 "name": "BaseBdev2", 00:19:59.834 "uuid": "a8ce4853-e791-5648-a2ba-d3ae42a1739e", 00:19:59.835 "is_configured": true, 00:19:59.835 "data_offset": 2048, 00:19:59.835 "data_size": 63488 00:19:59.835 }, 00:19:59.835 { 00:19:59.835 "name": "BaseBdev3", 00:19:59.835 "uuid": "73fd9d8b-51a4-5bd6-9ac9-9f8d54178b9c", 00:19:59.835 "is_configured": true, 00:19:59.835 "data_offset": 2048, 00:19:59.835 "data_size": 63488 00:19:59.835 }, 00:19:59.835 { 00:19:59.835 "name": "BaseBdev4", 00:19:59.835 "uuid": "a0cd340a-79d7-55a4-ab24-77884449c5d2", 00:19:59.835 "is_configured": true, 00:19:59.835 "data_offset": 2048, 00:19:59.835 "data_size": 63488 00:19:59.835 } 00:19:59.835 ] 00:19:59.835 }' 00:19:59.835 10:27:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.835 10:27:37 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:00.802 [2024-07-15 10:27:37.849549] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:00.802 [2024-07-15 10:27:37.849589] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:00.802 [2024-07-15 10:27:37.852755] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:00.802 [2024-07-15 10:27:37.852794] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:00.802 [2024-07-15 10:27:37.852836] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:00.802 [2024-07-15 10:27:37.852847] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe2ac20 name raid_bdev1, state offline 00:20:00.802 0 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 549198 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 549198 ']' 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 549198 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 549198 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 549198' 00:20:00.802 killing process with pid 549198 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 549198 00:20:00.802 [2024-07-15 10:27:37.918100] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:00.802 10:27:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 549198 00:20:00.802 [2024-07-15 10:27:37.949864] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZGmiQZ5jbZ 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:20:01.062 00:20:01.062 real 0m7.619s 00:20:01.062 user 0m12.179s 00:20:01.062 sys 0m1.332s 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:01.062 10:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:01.062 ************************************ 00:20:01.062 END TEST raid_write_error_test 00:20:01.062 ************************************ 00:20:01.062 10:27:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:01.062 10:27:38 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:01.062 10:27:38 
bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:20:01.062 10:27:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:01.062 10:27:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:01.062 10:27:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:01.322 ************************************ 00:20:01.322 START TEST raid_state_function_test 00:20:01.322 ************************************ 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 
-- # echo BaseBdev3 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=550310 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 
'Process raid pid: 550310' 00:20:01.322 Process raid pid: 550310 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 550310 /var/tmp/spdk-raid.sock 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 550310 ']' 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:01.322 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:01.322 10:27:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:01.322 [2024-07-15 10:27:38.351729] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:20:01.322 [2024-07-15 10:27:38.351798] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:01.322 [2024-07-15 10:27:38.482019] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:01.582 [2024-07-15 10:27:38.586303] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:01.582 [2024-07-15 10:27:38.644480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:01.582 [2024-07-15 10:27:38.644522] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:02.150 10:27:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:02.150 10:27:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:02.150 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:02.408 [2024-07-15 10:27:39.442347] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:02.408 [2024-07-15 10:27:39.442390] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:02.408 [2024-07-15 10:27:39.442401] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:02.408 [2024-07-15 10:27:39.442414] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:02.408 [2024-07-15 10:27:39.442423] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:02.408 [2024-07-15 10:27:39.442434] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:20:02.408 [2024-07-15 10:27:39.442443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:02.408 [2024-07-15 10:27:39.442454] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.408 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:02.666 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.666 "name": "Existed_Raid", 00:20:02.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.666 "strip_size_kb": 64, 
00:20:02.666 "state": "configuring", 00:20:02.666 "raid_level": "concat", 00:20:02.666 "superblock": false, 00:20:02.666 "num_base_bdevs": 4, 00:20:02.666 "num_base_bdevs_discovered": 0, 00:20:02.666 "num_base_bdevs_operational": 4, 00:20:02.666 "base_bdevs_list": [ 00:20:02.666 { 00:20:02.666 "name": "BaseBdev1", 00:20:02.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.666 "is_configured": false, 00:20:02.666 "data_offset": 0, 00:20:02.666 "data_size": 0 00:20:02.666 }, 00:20:02.666 { 00:20:02.666 "name": "BaseBdev2", 00:20:02.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.666 "is_configured": false, 00:20:02.666 "data_offset": 0, 00:20:02.666 "data_size": 0 00:20:02.666 }, 00:20:02.666 { 00:20:02.666 "name": "BaseBdev3", 00:20:02.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.666 "is_configured": false, 00:20:02.666 "data_offset": 0, 00:20:02.666 "data_size": 0 00:20:02.666 }, 00:20:02.666 { 00:20:02.666 "name": "BaseBdev4", 00:20:02.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.666 "is_configured": false, 00:20:02.666 "data_offset": 0, 00:20:02.666 "data_size": 0 00:20:02.666 } 00:20:02.666 ] 00:20:02.666 }' 00:20:02.666 10:27:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.666 10:27:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.232 10:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:03.491 [2024-07-15 10:27:40.533115] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:03.491 [2024-07-15 10:27:40.533150] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee9aa0 name Existed_Raid, state configuring 00:20:03.491 10:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:03.749 [2024-07-15 10:27:40.705578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:03.749 [2024-07-15 10:27:40.705607] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:03.749 [2024-07-15 10:27:40.705616] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:03.749 [2024-07-15 10:27:40.705628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:03.749 [2024-07-15 10:27:40.705637] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:03.749 [2024-07-15 10:27:40.705648] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:03.749 [2024-07-15 10:27:40.705656] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:03.749 [2024-07-15 10:27:40.705667] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:03.749 10:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:03.749 [2024-07-15 10:27:40.883894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:03.749 BaseBdev1 00:20:03.749 10:27:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:03.749 10:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:03.749 10:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:03.749 10:27:40 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@899 -- # local i 00:20:03.750 10:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:03.750 10:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:03.750 10:27:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:04.008 10:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:04.267 [ 00:20:04.267 { 00:20:04.267 "name": "BaseBdev1", 00:20:04.267 "aliases": [ 00:20:04.267 "ee094a2e-481c-400c-8118-81d15c22043b" 00:20:04.267 ], 00:20:04.267 "product_name": "Malloc disk", 00:20:04.267 "block_size": 512, 00:20:04.267 "num_blocks": 65536, 00:20:04.267 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:04.267 "assigned_rate_limits": { 00:20:04.267 "rw_ios_per_sec": 0, 00:20:04.267 "rw_mbytes_per_sec": 0, 00:20:04.267 "r_mbytes_per_sec": 0, 00:20:04.267 "w_mbytes_per_sec": 0 00:20:04.267 }, 00:20:04.267 "claimed": true, 00:20:04.267 "claim_type": "exclusive_write", 00:20:04.267 "zoned": false, 00:20:04.267 "supported_io_types": { 00:20:04.267 "read": true, 00:20:04.267 "write": true, 00:20:04.267 "unmap": true, 00:20:04.267 "flush": true, 00:20:04.267 "reset": true, 00:20:04.267 "nvme_admin": false, 00:20:04.267 "nvme_io": false, 00:20:04.267 "nvme_io_md": false, 00:20:04.267 "write_zeroes": true, 00:20:04.267 "zcopy": true, 00:20:04.267 "get_zone_info": false, 00:20:04.267 "zone_management": false, 00:20:04.267 "zone_append": false, 00:20:04.267 "compare": false, 00:20:04.267 "compare_and_write": false, 00:20:04.267 "abort": true, 00:20:04.267 "seek_hole": false, 00:20:04.267 "seek_data": false, 00:20:04.267 "copy": true, 00:20:04.267 "nvme_iov_md": 
false 00:20:04.267 }, 00:20:04.267 "memory_domains": [ 00:20:04.267 { 00:20:04.267 "dma_device_id": "system", 00:20:04.267 "dma_device_type": 1 00:20:04.267 }, 00:20:04.267 { 00:20:04.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.267 "dma_device_type": 2 00:20:04.267 } 00:20:04.267 ], 00:20:04.267 "driver_specific": {} 00:20:04.267 } 00:20:04.267 ] 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.267 10:27:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.267 "name": "Existed_Raid", 00:20:04.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.267 "strip_size_kb": 64, 00:20:04.267 "state": "configuring", 00:20:04.267 "raid_level": "concat", 00:20:04.267 "superblock": false, 00:20:04.267 "num_base_bdevs": 4, 00:20:04.267 "num_base_bdevs_discovered": 1, 00:20:04.267 "num_base_bdevs_operational": 4, 00:20:04.267 "base_bdevs_list": [ 00:20:04.267 { 00:20:04.267 "name": "BaseBdev1", 00:20:04.267 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:04.267 "is_configured": true, 00:20:04.267 "data_offset": 0, 00:20:04.267 "data_size": 65536 00:20:04.267 }, 00:20:04.267 { 00:20:04.267 "name": "BaseBdev2", 00:20:04.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.267 "is_configured": false, 00:20:04.267 "data_offset": 0, 00:20:04.267 "data_size": 0 00:20:04.267 }, 00:20:04.267 { 00:20:04.267 "name": "BaseBdev3", 00:20:04.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.267 "is_configured": false, 00:20:04.267 "data_offset": 0, 00:20:04.267 "data_size": 0 00:20:04.267 }, 00:20:04.267 { 00:20:04.267 "name": "BaseBdev4", 00:20:04.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.267 "is_configured": false, 00:20:04.267 "data_offset": 0, 00:20:04.267 "data_size": 0 00:20:04.267 } 00:20:04.267 ] 00:20:04.267 }' 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.267 10:27:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:04.833 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:05.091 [2024-07-15 10:27:42.247500] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:05.091 [2024-07-15 10:27:42.247549] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xee9310 name Existed_Raid, state configuring 00:20:05.091 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:05.349 [2024-07-15 10:27:42.492198] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:05.349 [2024-07-15 10:27:42.493639] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:05.349 [2024-07-15 10:27:42.493673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:05.349 [2024-07-15 10:27:42.493683] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:05.349 [2024-07-15 10:27:42.493695] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:05.349 [2024-07-15 10:27:42.493704] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:05.349 [2024-07-15 10:27:42.493715] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.349 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:05.607 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:05.607 "name": "Existed_Raid", 00:20:05.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.607 "strip_size_kb": 64, 00:20:05.607 "state": "configuring", 00:20:05.607 "raid_level": "concat", 00:20:05.607 "superblock": false, 00:20:05.607 "num_base_bdevs": 4, 00:20:05.607 "num_base_bdevs_discovered": 1, 00:20:05.607 "num_base_bdevs_operational": 4, 00:20:05.607 "base_bdevs_list": [ 00:20:05.607 { 00:20:05.607 "name": "BaseBdev1", 00:20:05.607 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:05.607 "is_configured": true, 00:20:05.607 "data_offset": 0, 00:20:05.607 "data_size": 65536 00:20:05.607 }, 00:20:05.607 { 00:20:05.607 "name": "BaseBdev2", 00:20:05.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.607 "is_configured": false, 00:20:05.607 "data_offset": 0, 00:20:05.607 "data_size": 0 00:20:05.607 }, 00:20:05.607 { 00:20:05.607 "name": "BaseBdev3", 
00:20:05.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.607 "is_configured": false, 00:20:05.607 "data_offset": 0, 00:20:05.607 "data_size": 0 00:20:05.607 }, 00:20:05.607 { 00:20:05.607 "name": "BaseBdev4", 00:20:05.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.607 "is_configured": false, 00:20:05.607 "data_offset": 0, 00:20:05.607 "data_size": 0 00:20:05.607 } 00:20:05.607 ] 00:20:05.607 }' 00:20:05.607 10:27:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:05.607 10:27:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.174 10:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:06.431 [2024-07-15 10:27:43.586590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:06.431 BaseBdev2 00:20:06.431 10:27:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:06.431 10:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:06.431 10:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:06.431 10:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:06.431 10:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:06.431 10:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:06.431 10:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:06.688 10:27:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:06.946 [ 00:20:06.946 { 00:20:06.946 "name": "BaseBdev2", 00:20:06.946 "aliases": [ 00:20:06.946 "08dabd72-0368-4ba2-a262-16458cf7225e" 00:20:06.946 ], 00:20:06.946 "product_name": "Malloc disk", 00:20:06.946 "block_size": 512, 00:20:06.946 "num_blocks": 65536, 00:20:06.946 "uuid": "08dabd72-0368-4ba2-a262-16458cf7225e", 00:20:06.946 "assigned_rate_limits": { 00:20:06.946 "rw_ios_per_sec": 0, 00:20:06.946 "rw_mbytes_per_sec": 0, 00:20:06.946 "r_mbytes_per_sec": 0, 00:20:06.946 "w_mbytes_per_sec": 0 00:20:06.946 }, 00:20:06.946 "claimed": true, 00:20:06.946 "claim_type": "exclusive_write", 00:20:06.946 "zoned": false, 00:20:06.946 "supported_io_types": { 00:20:06.946 "read": true, 00:20:06.946 "write": true, 00:20:06.946 "unmap": true, 00:20:06.946 "flush": true, 00:20:06.946 "reset": true, 00:20:06.946 "nvme_admin": false, 00:20:06.946 "nvme_io": false, 00:20:06.946 "nvme_io_md": false, 00:20:06.946 "write_zeroes": true, 00:20:06.946 "zcopy": true, 00:20:06.946 "get_zone_info": false, 00:20:06.946 "zone_management": false, 00:20:06.946 "zone_append": false, 00:20:06.946 "compare": false, 00:20:06.946 "compare_and_write": false, 00:20:06.946 "abort": true, 00:20:06.946 "seek_hole": false, 00:20:06.946 "seek_data": false, 00:20:06.946 "copy": true, 00:20:06.946 "nvme_iov_md": false 00:20:06.946 }, 00:20:06.946 "memory_domains": [ 00:20:06.946 { 00:20:06.946 "dma_device_id": "system", 00:20:06.946 "dma_device_type": 1 00:20:06.946 }, 00:20:06.946 { 00:20:06.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.946 "dma_device_type": 2 00:20:06.946 } 00:20:06.946 ], 00:20:06.946 "driver_specific": {} 00:20:06.946 } 00:20:06.946 ] 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:06.946 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:06.947 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:06.947 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.947 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.204 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.204 "name": "Existed_Raid", 00:20:07.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.204 "strip_size_kb": 64, 00:20:07.204 "state": "configuring", 00:20:07.204 "raid_level": "concat", 00:20:07.204 "superblock": false, 00:20:07.205 "num_base_bdevs": 4, 00:20:07.205 
"num_base_bdevs_discovered": 2, 00:20:07.205 "num_base_bdevs_operational": 4, 00:20:07.205 "base_bdevs_list": [ 00:20:07.205 { 00:20:07.205 "name": "BaseBdev1", 00:20:07.205 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:07.205 "is_configured": true, 00:20:07.205 "data_offset": 0, 00:20:07.205 "data_size": 65536 00:20:07.205 }, 00:20:07.205 { 00:20:07.205 "name": "BaseBdev2", 00:20:07.205 "uuid": "08dabd72-0368-4ba2-a262-16458cf7225e", 00:20:07.205 "is_configured": true, 00:20:07.205 "data_offset": 0, 00:20:07.205 "data_size": 65536 00:20:07.205 }, 00:20:07.205 { 00:20:07.205 "name": "BaseBdev3", 00:20:07.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.205 "is_configured": false, 00:20:07.205 "data_offset": 0, 00:20:07.205 "data_size": 0 00:20:07.205 }, 00:20:07.205 { 00:20:07.205 "name": "BaseBdev4", 00:20:07.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:07.205 "is_configured": false, 00:20:07.205 "data_offset": 0, 00:20:07.205 "data_size": 0 00:20:07.205 } 00:20:07.205 ] 00:20:07.205 }' 00:20:07.205 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.205 10:27:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:07.770 10:27:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:08.027 [2024-07-15 10:27:45.154232] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:08.027 BaseBdev3 00:20:08.027 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:08.027 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:08.027 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:08.027 10:27:45 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:08.027 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:08.027 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:08.027 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:08.285 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:08.542 [ 00:20:08.542 { 00:20:08.542 "name": "BaseBdev3", 00:20:08.542 "aliases": [ 00:20:08.542 "3a690399-741f-4c37-a02a-9899db90d6fe" 00:20:08.542 ], 00:20:08.542 "product_name": "Malloc disk", 00:20:08.542 "block_size": 512, 00:20:08.542 "num_blocks": 65536, 00:20:08.542 "uuid": "3a690399-741f-4c37-a02a-9899db90d6fe", 00:20:08.542 "assigned_rate_limits": { 00:20:08.542 "rw_ios_per_sec": 0, 00:20:08.542 "rw_mbytes_per_sec": 0, 00:20:08.542 "r_mbytes_per_sec": 0, 00:20:08.542 "w_mbytes_per_sec": 0 00:20:08.542 }, 00:20:08.542 "claimed": true, 00:20:08.542 "claim_type": "exclusive_write", 00:20:08.542 "zoned": false, 00:20:08.542 "supported_io_types": { 00:20:08.542 "read": true, 00:20:08.542 "write": true, 00:20:08.542 "unmap": true, 00:20:08.542 "flush": true, 00:20:08.542 "reset": true, 00:20:08.542 "nvme_admin": false, 00:20:08.542 "nvme_io": false, 00:20:08.542 "nvme_io_md": false, 00:20:08.542 "write_zeroes": true, 00:20:08.542 "zcopy": true, 00:20:08.542 "get_zone_info": false, 00:20:08.542 "zone_management": false, 00:20:08.542 "zone_append": false, 00:20:08.542 "compare": false, 00:20:08.542 "compare_and_write": false, 00:20:08.542 "abort": true, 00:20:08.542 "seek_hole": false, 00:20:08.542 "seek_data": false, 00:20:08.542 "copy": 
true, 00:20:08.542 "nvme_iov_md": false 00:20:08.542 }, 00:20:08.542 "memory_domains": [ 00:20:08.542 { 00:20:08.542 "dma_device_id": "system", 00:20:08.542 "dma_device_type": 1 00:20:08.542 }, 00:20:08.542 { 00:20:08.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.542 "dma_device_type": 2 00:20:08.542 } 00:20:08.542 ], 00:20:08.542 "driver_specific": {} 00:20:08.542 } 00:20:08.542 ] 00:20:08.542 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:08.542 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:08.542 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:20:08.543 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.800 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.800 "name": "Existed_Raid", 00:20:08.800 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.800 "strip_size_kb": 64, 00:20:08.800 "state": "configuring", 00:20:08.800 "raid_level": "concat", 00:20:08.800 "superblock": false, 00:20:08.800 "num_base_bdevs": 4, 00:20:08.800 "num_base_bdevs_discovered": 3, 00:20:08.800 "num_base_bdevs_operational": 4, 00:20:08.800 "base_bdevs_list": [ 00:20:08.800 { 00:20:08.800 "name": "BaseBdev1", 00:20:08.800 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:08.800 "is_configured": true, 00:20:08.800 "data_offset": 0, 00:20:08.800 "data_size": 65536 00:20:08.800 }, 00:20:08.800 { 00:20:08.800 "name": "BaseBdev2", 00:20:08.800 "uuid": "08dabd72-0368-4ba2-a262-16458cf7225e", 00:20:08.800 "is_configured": true, 00:20:08.800 "data_offset": 0, 00:20:08.800 "data_size": 65536 00:20:08.800 }, 00:20:08.800 { 00:20:08.800 "name": "BaseBdev3", 00:20:08.801 "uuid": "3a690399-741f-4c37-a02a-9899db90d6fe", 00:20:08.801 "is_configured": true, 00:20:08.801 "data_offset": 0, 00:20:08.801 "data_size": 65536 00:20:08.801 }, 00:20:08.801 { 00:20:08.801 "name": "BaseBdev4", 00:20:08.801 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.801 "is_configured": false, 00:20:08.801 "data_offset": 0, 00:20:08.801 "data_size": 0 00:20:08.801 } 00:20:08.801 ] 00:20:08.801 }' 00:20:08.801 10:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.801 10:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:09.364 10:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:09.622 [2024-07-15 10:27:46.745806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:09.622 [2024-07-15 10:27:46.745844] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xeea350 00:20:09.622 [2024-07-15 10:27:46.745853] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:09.622 [2024-07-15 10:27:46.746112] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xeea020 00:20:09.622 [2024-07-15 10:27:46.746236] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeea350 00:20:09.622 [2024-07-15 10:27:46.746246] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xeea350 00:20:09.622 [2024-07-15 10:27:46.746412] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:09.622 BaseBdev4 00:20:09.622 10:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:09.622 10:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:09.622 10:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:09.622 10:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:09.622 10:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:09.622 10:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:09.622 10:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:09.880 10:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:10.138 [ 00:20:10.138 { 00:20:10.138 "name": "BaseBdev4", 00:20:10.138 "aliases": [ 00:20:10.138 "09a7f093-6e44-4996-ab2e-d3cae281d81d" 00:20:10.138 ], 00:20:10.138 "product_name": "Malloc disk", 00:20:10.138 "block_size": 512, 00:20:10.138 "num_blocks": 65536, 00:20:10.138 "uuid": "09a7f093-6e44-4996-ab2e-d3cae281d81d", 00:20:10.138 "assigned_rate_limits": { 00:20:10.138 "rw_ios_per_sec": 0, 00:20:10.138 "rw_mbytes_per_sec": 0, 00:20:10.138 "r_mbytes_per_sec": 0, 00:20:10.138 "w_mbytes_per_sec": 0 00:20:10.138 }, 00:20:10.138 "claimed": true, 00:20:10.138 "claim_type": "exclusive_write", 00:20:10.138 "zoned": false, 00:20:10.138 "supported_io_types": { 00:20:10.138 "read": true, 00:20:10.138 "write": true, 00:20:10.138 "unmap": true, 00:20:10.138 "flush": true, 00:20:10.138 "reset": true, 00:20:10.138 "nvme_admin": false, 00:20:10.138 "nvme_io": false, 00:20:10.138 "nvme_io_md": false, 00:20:10.138 "write_zeroes": true, 00:20:10.138 "zcopy": true, 00:20:10.138 "get_zone_info": false, 00:20:10.138 "zone_management": false, 00:20:10.138 "zone_append": false, 00:20:10.138 "compare": false, 00:20:10.138 "compare_and_write": false, 00:20:10.138 "abort": true, 00:20:10.138 "seek_hole": false, 00:20:10.138 "seek_data": false, 00:20:10.138 "copy": true, 00:20:10.138 "nvme_iov_md": false 00:20:10.138 }, 00:20:10.138 "memory_domains": [ 00:20:10.138 { 00:20:10.138 "dma_device_id": "system", 00:20:10.138 "dma_device_type": 1 00:20:10.138 }, 00:20:10.138 { 00:20:10.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.138 "dma_device_type": 2 00:20:10.138 } 00:20:10.138 ], 00:20:10.138 "driver_specific": {} 00:20:10.138 } 00:20:10.138 ] 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.138 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:10.396 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.396 "name": "Existed_Raid", 00:20:10.396 "uuid": "5e6fda1f-6e7c-44eb-9857-6f27289aa38b", 00:20:10.396 "strip_size_kb": 64, 00:20:10.396 "state": "online", 00:20:10.396 "raid_level": "concat", 00:20:10.396 "superblock": false, 00:20:10.396 "num_base_bdevs": 4, 00:20:10.396 "num_base_bdevs_discovered": 4, 00:20:10.396 "num_base_bdevs_operational": 
4, 00:20:10.396 "base_bdevs_list": [ 00:20:10.396 { 00:20:10.396 "name": "BaseBdev1", 00:20:10.396 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:10.396 "is_configured": true, 00:20:10.396 "data_offset": 0, 00:20:10.396 "data_size": 65536 00:20:10.396 }, 00:20:10.396 { 00:20:10.396 "name": "BaseBdev2", 00:20:10.396 "uuid": "08dabd72-0368-4ba2-a262-16458cf7225e", 00:20:10.396 "is_configured": true, 00:20:10.396 "data_offset": 0, 00:20:10.396 "data_size": 65536 00:20:10.396 }, 00:20:10.396 { 00:20:10.396 "name": "BaseBdev3", 00:20:10.396 "uuid": "3a690399-741f-4c37-a02a-9899db90d6fe", 00:20:10.396 "is_configured": true, 00:20:10.396 "data_offset": 0, 00:20:10.396 "data_size": 65536 00:20:10.396 }, 00:20:10.396 { 00:20:10.396 "name": "BaseBdev4", 00:20:10.396 "uuid": "09a7f093-6e44-4996-ab2e-d3cae281d81d", 00:20:10.396 "is_configured": true, 00:20:10.396 "data_offset": 0, 00:20:10.396 "data_size": 65536 00:20:10.396 } 00:20:10.396 ] 00:20:10.396 }' 00:20:10.396 10:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.396 10:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:10.962 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:11.219 [2024-07-15 10:27:48.322342] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:11.219 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:11.219 "name": "Existed_Raid", 00:20:11.219 "aliases": [ 00:20:11.219 "5e6fda1f-6e7c-44eb-9857-6f27289aa38b" 00:20:11.219 ], 00:20:11.219 "product_name": "Raid Volume", 00:20:11.219 "block_size": 512, 00:20:11.219 "num_blocks": 262144, 00:20:11.219 "uuid": "5e6fda1f-6e7c-44eb-9857-6f27289aa38b", 00:20:11.219 "assigned_rate_limits": { 00:20:11.219 "rw_ios_per_sec": 0, 00:20:11.219 "rw_mbytes_per_sec": 0, 00:20:11.219 "r_mbytes_per_sec": 0, 00:20:11.219 "w_mbytes_per_sec": 0 00:20:11.219 }, 00:20:11.219 "claimed": false, 00:20:11.219 "zoned": false, 00:20:11.219 "supported_io_types": { 00:20:11.219 "read": true, 00:20:11.219 "write": true, 00:20:11.219 "unmap": true, 00:20:11.219 "flush": true, 00:20:11.219 "reset": true, 00:20:11.219 "nvme_admin": false, 00:20:11.219 "nvme_io": false, 00:20:11.219 "nvme_io_md": false, 00:20:11.219 "write_zeroes": true, 00:20:11.219 "zcopy": false, 00:20:11.219 "get_zone_info": false, 00:20:11.219 "zone_management": false, 00:20:11.219 "zone_append": false, 00:20:11.219 "compare": false, 00:20:11.219 "compare_and_write": false, 00:20:11.219 "abort": false, 00:20:11.219 "seek_hole": false, 00:20:11.219 "seek_data": false, 00:20:11.219 "copy": false, 00:20:11.219 "nvme_iov_md": false 00:20:11.219 }, 00:20:11.219 "memory_domains": [ 00:20:11.219 { 00:20:11.219 "dma_device_id": "system", 00:20:11.219 "dma_device_type": 1 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.219 "dma_device_type": 2 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "dma_device_id": "system", 00:20:11.219 "dma_device_type": 1 00:20:11.219 }, 
00:20:11.219 { 00:20:11.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.219 "dma_device_type": 2 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "dma_device_id": "system", 00:20:11.219 "dma_device_type": 1 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.219 "dma_device_type": 2 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "dma_device_id": "system", 00:20:11.219 "dma_device_type": 1 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.219 "dma_device_type": 2 00:20:11.219 } 00:20:11.219 ], 00:20:11.219 "driver_specific": { 00:20:11.219 "raid": { 00:20:11.219 "uuid": "5e6fda1f-6e7c-44eb-9857-6f27289aa38b", 00:20:11.219 "strip_size_kb": 64, 00:20:11.219 "state": "online", 00:20:11.219 "raid_level": "concat", 00:20:11.219 "superblock": false, 00:20:11.219 "num_base_bdevs": 4, 00:20:11.219 "num_base_bdevs_discovered": 4, 00:20:11.219 "num_base_bdevs_operational": 4, 00:20:11.219 "base_bdevs_list": [ 00:20:11.219 { 00:20:11.219 "name": "BaseBdev1", 00:20:11.219 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:11.219 "is_configured": true, 00:20:11.219 "data_offset": 0, 00:20:11.219 "data_size": 65536 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "name": "BaseBdev2", 00:20:11.219 "uuid": "08dabd72-0368-4ba2-a262-16458cf7225e", 00:20:11.219 "is_configured": true, 00:20:11.219 "data_offset": 0, 00:20:11.219 "data_size": 65536 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "name": "BaseBdev3", 00:20:11.219 "uuid": "3a690399-741f-4c37-a02a-9899db90d6fe", 00:20:11.219 "is_configured": true, 00:20:11.219 "data_offset": 0, 00:20:11.219 "data_size": 65536 00:20:11.219 }, 00:20:11.219 { 00:20:11.219 "name": "BaseBdev4", 00:20:11.219 "uuid": "09a7f093-6e44-4996-ab2e-d3cae281d81d", 00:20:11.219 "is_configured": true, 00:20:11.219 "data_offset": 0, 00:20:11.219 "data_size": 65536 00:20:11.219 } 00:20:11.219 ] 00:20:11.219 } 00:20:11.219 } 00:20:11.219 }' 00:20:11.219 10:27:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:11.219 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:11.219 BaseBdev2 00:20:11.219 BaseBdev3 00:20:11.219 BaseBdev4' 00:20:11.220 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:11.220 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:11.220 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:11.477 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:11.477 "name": "BaseBdev1", 00:20:11.477 "aliases": [ 00:20:11.477 "ee094a2e-481c-400c-8118-81d15c22043b" 00:20:11.477 ], 00:20:11.477 "product_name": "Malloc disk", 00:20:11.477 "block_size": 512, 00:20:11.477 "num_blocks": 65536, 00:20:11.477 "uuid": "ee094a2e-481c-400c-8118-81d15c22043b", 00:20:11.477 "assigned_rate_limits": { 00:20:11.477 "rw_ios_per_sec": 0, 00:20:11.477 "rw_mbytes_per_sec": 0, 00:20:11.477 "r_mbytes_per_sec": 0, 00:20:11.477 "w_mbytes_per_sec": 0 00:20:11.477 }, 00:20:11.477 "claimed": true, 00:20:11.477 "claim_type": "exclusive_write", 00:20:11.477 "zoned": false, 00:20:11.477 "supported_io_types": { 00:20:11.477 "read": true, 00:20:11.477 "write": true, 00:20:11.477 "unmap": true, 00:20:11.477 "flush": true, 00:20:11.477 "reset": true, 00:20:11.477 "nvme_admin": false, 00:20:11.477 "nvme_io": false, 00:20:11.477 "nvme_io_md": false, 00:20:11.477 "write_zeroes": true, 00:20:11.477 "zcopy": true, 00:20:11.477 "get_zone_info": false, 00:20:11.477 "zone_management": false, 00:20:11.477 "zone_append": false, 00:20:11.477 "compare": false, 00:20:11.477 "compare_and_write": false, 00:20:11.477 
"abort": true, 00:20:11.477 "seek_hole": false, 00:20:11.477 "seek_data": false, 00:20:11.477 "copy": true, 00:20:11.477 "nvme_iov_md": false 00:20:11.477 }, 00:20:11.477 "memory_domains": [ 00:20:11.477 { 00:20:11.477 "dma_device_id": "system", 00:20:11.477 "dma_device_type": 1 00:20:11.477 }, 00:20:11.477 { 00:20:11.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.477 "dma_device_type": 2 00:20:11.477 } 00:20:11.477 ], 00:20:11.477 "driver_specific": {} 00:20:11.477 }' 00:20:11.477 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:11.734 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.991 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.991 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:11.991 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:11.991 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:11.991 10:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:12.249 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:12.249 "name": "BaseBdev2", 00:20:12.249 "aliases": [ 00:20:12.249 "08dabd72-0368-4ba2-a262-16458cf7225e" 00:20:12.249 ], 00:20:12.249 "product_name": "Malloc disk", 00:20:12.249 "block_size": 512, 00:20:12.249 "num_blocks": 65536, 00:20:12.249 "uuid": "08dabd72-0368-4ba2-a262-16458cf7225e", 00:20:12.249 "assigned_rate_limits": { 00:20:12.249 "rw_ios_per_sec": 0, 00:20:12.249 "rw_mbytes_per_sec": 0, 00:20:12.249 "r_mbytes_per_sec": 0, 00:20:12.249 "w_mbytes_per_sec": 0 00:20:12.249 }, 00:20:12.249 "claimed": true, 00:20:12.249 "claim_type": "exclusive_write", 00:20:12.249 "zoned": false, 00:20:12.249 "supported_io_types": { 00:20:12.249 "read": true, 00:20:12.249 "write": true, 00:20:12.249 "unmap": true, 00:20:12.249 "flush": true, 00:20:12.249 "reset": true, 00:20:12.249 "nvme_admin": false, 00:20:12.249 "nvme_io": false, 00:20:12.249 "nvme_io_md": false, 00:20:12.249 "write_zeroes": true, 00:20:12.249 "zcopy": true, 00:20:12.249 "get_zone_info": false, 00:20:12.249 "zone_management": false, 00:20:12.249 "zone_append": false, 00:20:12.249 "compare": false, 00:20:12.249 "compare_and_write": false, 00:20:12.249 "abort": true, 00:20:12.249 "seek_hole": false, 00:20:12.249 "seek_data": false, 00:20:12.249 "copy": true, 00:20:12.249 "nvme_iov_md": false 00:20:12.249 }, 00:20:12.249 "memory_domains": [ 00:20:12.249 { 00:20:12.249 "dma_device_id": "system", 00:20:12.249 "dma_device_type": 1 00:20:12.249 }, 00:20:12.249 { 00:20:12.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.249 "dma_device_type": 2 00:20:12.249 } 00:20:12.249 ], 00:20:12.249 "driver_specific": {} 00:20:12.249 }' 00:20:12.249 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.249 10:27:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.249 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:12.249 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.249 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.249 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:12.249 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:12.507 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:12.766 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:12.766 "name": "BaseBdev3", 00:20:12.766 "aliases": [ 00:20:12.766 "3a690399-741f-4c37-a02a-9899db90d6fe" 00:20:12.766 ], 00:20:12.766 "product_name": "Malloc disk", 00:20:12.766 "block_size": 512, 00:20:12.766 "num_blocks": 65536, 00:20:12.766 "uuid": "3a690399-741f-4c37-a02a-9899db90d6fe", 00:20:12.766 "assigned_rate_limits": { 00:20:12.766 
"rw_ios_per_sec": 0, 00:20:12.766 "rw_mbytes_per_sec": 0, 00:20:12.766 "r_mbytes_per_sec": 0, 00:20:12.766 "w_mbytes_per_sec": 0 00:20:12.766 }, 00:20:12.766 "claimed": true, 00:20:12.766 "claim_type": "exclusive_write", 00:20:12.766 "zoned": false, 00:20:12.766 "supported_io_types": { 00:20:12.766 "read": true, 00:20:12.766 "write": true, 00:20:12.766 "unmap": true, 00:20:12.766 "flush": true, 00:20:12.766 "reset": true, 00:20:12.766 "nvme_admin": false, 00:20:12.766 "nvme_io": false, 00:20:12.766 "nvme_io_md": false, 00:20:12.766 "write_zeroes": true, 00:20:12.766 "zcopy": true, 00:20:12.766 "get_zone_info": false, 00:20:12.766 "zone_management": false, 00:20:12.766 "zone_append": false, 00:20:12.766 "compare": false, 00:20:12.766 "compare_and_write": false, 00:20:12.766 "abort": true, 00:20:12.766 "seek_hole": false, 00:20:12.766 "seek_data": false, 00:20:12.766 "copy": true, 00:20:12.766 "nvme_iov_md": false 00:20:12.766 }, 00:20:12.766 "memory_domains": [ 00:20:12.766 { 00:20:12.766 "dma_device_id": "system", 00:20:12.766 "dma_device_type": 1 00:20:12.766 }, 00:20:12.766 { 00:20:12.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.766 "dma_device_type": 2 00:20:12.766 } 00:20:12.766 ], 00:20:12.766 "driver_specific": {} 00:20:12.766 }' 00:20:12.766 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.766 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.766 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:12.766 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.025 10:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.025 
10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:13.025 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:13.283 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:13.283 "name": "BaseBdev4", 00:20:13.283 "aliases": [ 00:20:13.283 "09a7f093-6e44-4996-ab2e-d3cae281d81d" 00:20:13.283 ], 00:20:13.283 "product_name": "Malloc disk", 00:20:13.283 "block_size": 512, 00:20:13.283 "num_blocks": 65536, 00:20:13.283 "uuid": "09a7f093-6e44-4996-ab2e-d3cae281d81d", 00:20:13.283 "assigned_rate_limits": { 00:20:13.283 "rw_ios_per_sec": 0, 00:20:13.283 "rw_mbytes_per_sec": 0, 00:20:13.283 "r_mbytes_per_sec": 0, 00:20:13.283 "w_mbytes_per_sec": 0 00:20:13.283 }, 00:20:13.283 "claimed": true, 00:20:13.283 "claim_type": "exclusive_write", 00:20:13.283 "zoned": false, 00:20:13.283 "supported_io_types": { 00:20:13.283 "read": true, 00:20:13.283 "write": true, 00:20:13.283 "unmap": true, 00:20:13.283 "flush": true, 00:20:13.283 "reset": true, 00:20:13.283 "nvme_admin": false, 00:20:13.283 "nvme_io": false, 00:20:13.283 "nvme_io_md": false, 00:20:13.283 "write_zeroes": true, 00:20:13.283 "zcopy": true, 00:20:13.283 "get_zone_info": 
false, 00:20:13.283 "zone_management": false, 00:20:13.283 "zone_append": false, 00:20:13.283 "compare": false, 00:20:13.283 "compare_and_write": false, 00:20:13.283 "abort": true, 00:20:13.283 "seek_hole": false, 00:20:13.283 "seek_data": false, 00:20:13.283 "copy": true, 00:20:13.283 "nvme_iov_md": false 00:20:13.283 }, 00:20:13.283 "memory_domains": [ 00:20:13.283 { 00:20:13.283 "dma_device_id": "system", 00:20:13.283 "dma_device_type": 1 00:20:13.283 }, 00:20:13.283 { 00:20:13.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.283 "dma_device_type": 2 00:20:13.283 } 00:20:13.283 ], 00:20:13.283 "driver_specific": {} 00:20:13.283 }' 00:20:13.283 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.283 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.283 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:13.283 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:13.541 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:13.799 [2024-07-15 10:27:50.945049] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:13.799 [2024-07-15 10:27:50.945079] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:13.799 [2024-07-15 10:27:50.945128] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.799 10:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:14.056 10:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.056 "name": "Existed_Raid", 00:20:14.056 "uuid": "5e6fda1f-6e7c-44eb-9857-6f27289aa38b", 00:20:14.056 "strip_size_kb": 64, 00:20:14.056 "state": "offline", 00:20:14.056 "raid_level": "concat", 00:20:14.056 "superblock": false, 00:20:14.056 "num_base_bdevs": 4, 00:20:14.056 "num_base_bdevs_discovered": 3, 00:20:14.056 "num_base_bdevs_operational": 3, 00:20:14.056 "base_bdevs_list": [ 00:20:14.056 { 00:20:14.056 "name": null, 00:20:14.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:14.056 "is_configured": false, 00:20:14.056 "data_offset": 0, 00:20:14.056 "data_size": 65536 00:20:14.056 }, 00:20:14.056 { 00:20:14.056 "name": "BaseBdev2", 00:20:14.056 "uuid": "08dabd72-0368-4ba2-a262-16458cf7225e", 00:20:14.056 "is_configured": true, 00:20:14.056 "data_offset": 0, 00:20:14.056 "data_size": 65536 00:20:14.056 }, 00:20:14.056 { 00:20:14.056 "name": "BaseBdev3", 00:20:14.056 "uuid": "3a690399-741f-4c37-a02a-9899db90d6fe", 00:20:14.056 "is_configured": true, 00:20:14.056 "data_offset": 0, 00:20:14.056 "data_size": 65536 00:20:14.056 }, 00:20:14.056 { 00:20:14.056 "name": "BaseBdev4", 00:20:14.056 "uuid": "09a7f093-6e44-4996-ab2e-d3cae281d81d", 00:20:14.056 "is_configured": true, 00:20:14.057 "data_offset": 0, 00:20:14.057 "data_size": 65536 00:20:14.057 } 00:20:14.057 ] 00:20:14.057 }' 00:20:14.057 10:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:20:14.057 10:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:14.620 10:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:14.620 10:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:14.620 10:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:14.620 10:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.877 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:14.877 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:14.877 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:15.135 [2024-07-15 10:27:52.249550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:15.135 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:15.135 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:15.135 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.135 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:15.393 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:15.393 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:15.393 10:27:52 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:15.650 [2024-07-15 10:27:52.755314] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:15.650 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:15.650 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:15.650 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.650 10:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:15.907 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:15.907 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:15.907 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:16.165 [2024-07-15 10:27:53.259187] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:16.165 [2024-07-15 10:27:53.259239] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeea350 name Existed_Raid, state offline 00:20:16.165 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:16.165 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:16.165 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.165 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | 
select(.)' 00:20:16.423 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:16.423 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:16.423 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:16.423 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:16.423 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:16.423 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:16.680 BaseBdev2 00:20:16.680 10:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:16.680 10:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:16.680 10:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:16.680 10:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:16.680 10:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:16.680 10:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:16.680 10:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:16.968 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:17.224 [ 00:20:17.225 { 00:20:17.225 "name": "BaseBdev2", 00:20:17.225 "aliases": [ 00:20:17.225 
"3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f" 00:20:17.225 ], 00:20:17.225 "product_name": "Malloc disk", 00:20:17.225 "block_size": 512, 00:20:17.225 "num_blocks": 65536, 00:20:17.225 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:17.225 "assigned_rate_limits": { 00:20:17.225 "rw_ios_per_sec": 0, 00:20:17.225 "rw_mbytes_per_sec": 0, 00:20:17.225 "r_mbytes_per_sec": 0, 00:20:17.225 "w_mbytes_per_sec": 0 00:20:17.225 }, 00:20:17.225 "claimed": false, 00:20:17.225 "zoned": false, 00:20:17.225 "supported_io_types": { 00:20:17.225 "read": true, 00:20:17.225 "write": true, 00:20:17.225 "unmap": true, 00:20:17.225 "flush": true, 00:20:17.225 "reset": true, 00:20:17.225 "nvme_admin": false, 00:20:17.225 "nvme_io": false, 00:20:17.225 "nvme_io_md": false, 00:20:17.225 "write_zeroes": true, 00:20:17.225 "zcopy": true, 00:20:17.225 "get_zone_info": false, 00:20:17.225 "zone_management": false, 00:20:17.225 "zone_append": false, 00:20:17.225 "compare": false, 00:20:17.225 "compare_and_write": false, 00:20:17.225 "abort": true, 00:20:17.225 "seek_hole": false, 00:20:17.225 "seek_data": false, 00:20:17.225 "copy": true, 00:20:17.225 "nvme_iov_md": false 00:20:17.225 }, 00:20:17.225 "memory_domains": [ 00:20:17.225 { 00:20:17.225 "dma_device_id": "system", 00:20:17.225 "dma_device_type": 1 00:20:17.225 }, 00:20:17.225 { 00:20:17.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.225 "dma_device_type": 2 00:20:17.225 } 00:20:17.225 ], 00:20:17.225 "driver_specific": {} 00:20:17.225 } 00:20:17.225 ] 00:20:17.225 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:17.225 10:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:17.225 10:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:17.225 10:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:17.482 BaseBdev3 00:20:17.482 10:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:17.482 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:17.482 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:17.482 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:17.482 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:17.482 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:17.482 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:17.739 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:17.996 [ 00:20:17.996 { 00:20:17.996 "name": "BaseBdev3", 00:20:17.996 "aliases": [ 00:20:17.996 "1528e8ed-a2e4-444d-83a2-c3c9172f48cf" 00:20:17.996 ], 00:20:17.996 "product_name": "Malloc disk", 00:20:17.996 "block_size": 512, 00:20:17.996 "num_blocks": 65536, 00:20:17.996 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:17.996 "assigned_rate_limits": { 00:20:17.996 "rw_ios_per_sec": 0, 00:20:17.996 "rw_mbytes_per_sec": 0, 00:20:17.996 "r_mbytes_per_sec": 0, 00:20:17.996 "w_mbytes_per_sec": 0 00:20:17.996 }, 00:20:17.996 "claimed": false, 00:20:17.996 "zoned": false, 00:20:17.996 "supported_io_types": { 00:20:17.996 "read": true, 00:20:17.996 "write": true, 00:20:17.996 "unmap": true, 00:20:17.996 "flush": true, 00:20:17.996 "reset": true, 00:20:17.996 "nvme_admin": false, 00:20:17.996 
"nvme_io": false, 00:20:17.996 "nvme_io_md": false, 00:20:17.996 "write_zeroes": true, 00:20:17.996 "zcopy": true, 00:20:17.996 "get_zone_info": false, 00:20:17.996 "zone_management": false, 00:20:17.996 "zone_append": false, 00:20:17.996 "compare": false, 00:20:17.996 "compare_and_write": false, 00:20:17.996 "abort": true, 00:20:17.996 "seek_hole": false, 00:20:17.996 "seek_data": false, 00:20:17.996 "copy": true, 00:20:17.996 "nvme_iov_md": false 00:20:17.996 }, 00:20:17.996 "memory_domains": [ 00:20:17.996 { 00:20:17.996 "dma_device_id": "system", 00:20:17.996 "dma_device_type": 1 00:20:17.996 }, 00:20:17.996 { 00:20:17.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.996 "dma_device_type": 2 00:20:17.996 } 00:20:17.996 ], 00:20:17.996 "driver_specific": {} 00:20:17.996 } 00:20:17.996 ] 00:20:17.996 10:27:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:17.996 10:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:17.996 10:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:17.996 10:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:18.254 BaseBdev4 00:20:18.254 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:18.254 10:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:18.254 10:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:18.254 10:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:18.254 10:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:18.254 10:27:55 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:18.254 10:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:18.511 10:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:18.769 [ 00:20:18.769 { 00:20:18.769 "name": "BaseBdev4", 00:20:18.769 "aliases": [ 00:20:18.769 "48202d71-983d-4aef-8624-b2d28324d463" 00:20:18.769 ], 00:20:18.769 "product_name": "Malloc disk", 00:20:18.769 "block_size": 512, 00:20:18.769 "num_blocks": 65536, 00:20:18.769 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:18.769 "assigned_rate_limits": { 00:20:18.769 "rw_ios_per_sec": 0, 00:20:18.769 "rw_mbytes_per_sec": 0, 00:20:18.769 "r_mbytes_per_sec": 0, 00:20:18.769 "w_mbytes_per_sec": 0 00:20:18.769 }, 00:20:18.769 "claimed": false, 00:20:18.769 "zoned": false, 00:20:18.769 "supported_io_types": { 00:20:18.769 "read": true, 00:20:18.769 "write": true, 00:20:18.769 "unmap": true, 00:20:18.769 "flush": true, 00:20:18.769 "reset": true, 00:20:18.769 "nvme_admin": false, 00:20:18.769 "nvme_io": false, 00:20:18.769 "nvme_io_md": false, 00:20:18.769 "write_zeroes": true, 00:20:18.769 "zcopy": true, 00:20:18.769 "get_zone_info": false, 00:20:18.769 "zone_management": false, 00:20:18.769 "zone_append": false, 00:20:18.769 "compare": false, 00:20:18.769 "compare_and_write": false, 00:20:18.769 "abort": true, 00:20:18.769 "seek_hole": false, 00:20:18.769 "seek_data": false, 00:20:18.769 "copy": true, 00:20:18.769 "nvme_iov_md": false 00:20:18.769 }, 00:20:18.769 "memory_domains": [ 00:20:18.769 { 00:20:18.769 "dma_device_id": "system", 00:20:18.769 "dma_device_type": 1 00:20:18.769 }, 00:20:18.769 { 00:20:18.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:18.769 "dma_device_type": 
2 00:20:18.769 } 00:20:18.769 ], 00:20:18.769 "driver_specific": {} 00:20:18.769 } 00:20:18.769 ] 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:18.769 [2024-07-15 10:27:55.947097] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:18.769 [2024-07-15 10:27:55.947148] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:18.769 [2024-07-15 10:27:55.947169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:18.769 [2024-07-15 10:27:55.948571] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:18.769 [2024-07-15 10:27:55.948616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.769 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.026 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.026 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.026 10:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:19.026 10:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.026 "name": "Existed_Raid", 00:20:19.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.026 "strip_size_kb": 64, 00:20:19.026 "state": "configuring", 00:20:19.026 "raid_level": "concat", 00:20:19.026 "superblock": false, 00:20:19.026 "num_base_bdevs": 4, 00:20:19.026 "num_base_bdevs_discovered": 3, 00:20:19.026 "num_base_bdevs_operational": 4, 00:20:19.026 "base_bdevs_list": [ 00:20:19.026 { 00:20:19.026 "name": "BaseBdev1", 00:20:19.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:19.026 "is_configured": false, 00:20:19.026 "data_offset": 0, 00:20:19.026 "data_size": 0 00:20:19.026 }, 00:20:19.026 { 00:20:19.026 "name": "BaseBdev2", 00:20:19.026 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:19.026 "is_configured": true, 00:20:19.026 "data_offset": 0, 00:20:19.026 "data_size": 65536 00:20:19.026 }, 00:20:19.026 { 00:20:19.026 "name": "BaseBdev3", 00:20:19.026 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:19.026 "is_configured": true, 00:20:19.026 "data_offset": 0, 00:20:19.026 "data_size": 65536 
00:20:19.026 }, 00:20:19.026 { 00:20:19.026 "name": "BaseBdev4", 00:20:19.026 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:19.026 "is_configured": true, 00:20:19.026 "data_offset": 0, 00:20:19.026 "data_size": 65536 00:20:19.026 } 00:20:19.026 ] 00:20:19.026 }' 00:20:19.026 10:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.026 10:27:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:19.957 10:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:19.957 [2024-07-15 10:27:57.033963] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.957 10:27:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.957 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:20.215 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.215 "name": "Existed_Raid", 00:20:20.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.215 "strip_size_kb": 64, 00:20:20.215 "state": "configuring", 00:20:20.215 "raid_level": "concat", 00:20:20.215 "superblock": false, 00:20:20.215 "num_base_bdevs": 4, 00:20:20.215 "num_base_bdevs_discovered": 2, 00:20:20.215 "num_base_bdevs_operational": 4, 00:20:20.215 "base_bdevs_list": [ 00:20:20.215 { 00:20:20.215 "name": "BaseBdev1", 00:20:20.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.215 "is_configured": false, 00:20:20.215 "data_offset": 0, 00:20:20.215 "data_size": 0 00:20:20.215 }, 00:20:20.215 { 00:20:20.215 "name": null, 00:20:20.215 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:20.215 "is_configured": false, 00:20:20.215 "data_offset": 0, 00:20:20.215 "data_size": 65536 00:20:20.215 }, 00:20:20.215 { 00:20:20.215 "name": "BaseBdev3", 00:20:20.215 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:20.215 "is_configured": true, 00:20:20.215 "data_offset": 0, 00:20:20.215 "data_size": 65536 00:20:20.215 }, 00:20:20.215 { 00:20:20.215 "name": "BaseBdev4", 00:20:20.215 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:20.215 "is_configured": true, 00:20:20.215 "data_offset": 0, 00:20:20.215 "data_size": 65536 00:20:20.215 } 00:20:20.215 ] 00:20:20.215 }' 00:20:20.215 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.215 10:27:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.780 10:27:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:20.780 10:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.037 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:21.037 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:21.295 [2024-07-15 10:27:58.384954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:21.295 BaseBdev1 00:20:21.295 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:21.295 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:21.295 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:21.295 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:21.295 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:21.295 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:21.295 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:21.553 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:21.811 [ 00:20:21.811 { 00:20:21.811 "name": "BaseBdev1", 00:20:21.811 "aliases": [ 00:20:21.811 "94627db1-6799-407c-8d0f-c67ff83c8bd6" 00:20:21.811 
], 00:20:21.811 "product_name": "Malloc disk", 00:20:21.811 "block_size": 512, 00:20:21.811 "num_blocks": 65536, 00:20:21.811 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:21.811 "assigned_rate_limits": { 00:20:21.811 "rw_ios_per_sec": 0, 00:20:21.811 "rw_mbytes_per_sec": 0, 00:20:21.811 "r_mbytes_per_sec": 0, 00:20:21.811 "w_mbytes_per_sec": 0 00:20:21.811 }, 00:20:21.811 "claimed": true, 00:20:21.811 "claim_type": "exclusive_write", 00:20:21.811 "zoned": false, 00:20:21.811 "supported_io_types": { 00:20:21.811 "read": true, 00:20:21.811 "write": true, 00:20:21.811 "unmap": true, 00:20:21.811 "flush": true, 00:20:21.811 "reset": true, 00:20:21.811 "nvme_admin": false, 00:20:21.811 "nvme_io": false, 00:20:21.811 "nvme_io_md": false, 00:20:21.811 "write_zeroes": true, 00:20:21.811 "zcopy": true, 00:20:21.811 "get_zone_info": false, 00:20:21.811 "zone_management": false, 00:20:21.811 "zone_append": false, 00:20:21.811 "compare": false, 00:20:21.811 "compare_and_write": false, 00:20:21.811 "abort": true, 00:20:21.811 "seek_hole": false, 00:20:21.811 "seek_data": false, 00:20:21.811 "copy": true, 00:20:21.811 "nvme_iov_md": false 00:20:21.811 }, 00:20:21.811 "memory_domains": [ 00:20:21.811 { 00:20:21.811 "dma_device_id": "system", 00:20:21.811 "dma_device_type": 1 00:20:21.811 }, 00:20:21.811 { 00:20:21.811 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.811 "dma_device_type": 2 00:20:21.811 } 00:20:21.811 ], 00:20:21.811 "driver_specific": {} 00:20:21.811 } 00:20:21.811 ] 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.811 10:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:22.069 10:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.069 "name": "Existed_Raid", 00:20:22.069 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.069 "strip_size_kb": 64, 00:20:22.069 "state": "configuring", 00:20:22.069 "raid_level": "concat", 00:20:22.069 "superblock": false, 00:20:22.069 "num_base_bdevs": 4, 00:20:22.069 "num_base_bdevs_discovered": 3, 00:20:22.069 "num_base_bdevs_operational": 4, 00:20:22.069 "base_bdevs_list": [ 00:20:22.069 { 00:20:22.069 "name": "BaseBdev1", 00:20:22.069 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:22.069 "is_configured": true, 00:20:22.069 "data_offset": 0, 00:20:22.069 "data_size": 65536 00:20:22.069 }, 00:20:22.069 { 00:20:22.069 "name": null, 00:20:22.069 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:22.069 "is_configured": false, 00:20:22.069 
"data_offset": 0, 00:20:22.069 "data_size": 65536 00:20:22.069 }, 00:20:22.069 { 00:20:22.069 "name": "BaseBdev3", 00:20:22.069 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:22.069 "is_configured": true, 00:20:22.069 "data_offset": 0, 00:20:22.069 "data_size": 65536 00:20:22.069 }, 00:20:22.069 { 00:20:22.069 "name": "BaseBdev4", 00:20:22.069 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:22.069 "is_configured": true, 00:20:22.069 "data_offset": 0, 00:20:22.069 "data_size": 65536 00:20:22.069 } 00:20:22.069 ] 00:20:22.069 }' 00:20:22.069 10:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.069 10:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:22.635 10:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.635 10:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:22.893 10:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:22.893 10:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:23.151 [2024-07-15 10:28:00.213831] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.151 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:23.409 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:23.409 "name": "Existed_Raid", 00:20:23.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:23.409 "strip_size_kb": 64, 00:20:23.409 "state": "configuring", 00:20:23.409 "raid_level": "concat", 00:20:23.409 "superblock": false, 00:20:23.409 "num_base_bdevs": 4, 00:20:23.409 "num_base_bdevs_discovered": 2, 00:20:23.409 "num_base_bdevs_operational": 4, 00:20:23.409 "base_bdevs_list": [ 00:20:23.409 { 00:20:23.409 "name": "BaseBdev1", 00:20:23.409 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:23.409 "is_configured": true, 00:20:23.409 "data_offset": 0, 00:20:23.409 "data_size": 65536 00:20:23.409 }, 00:20:23.409 { 00:20:23.409 "name": null, 00:20:23.409 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:23.409 "is_configured": false, 00:20:23.409 "data_offset": 0, 00:20:23.409 "data_size": 65536 00:20:23.409 }, 00:20:23.409 { 00:20:23.409 "name": 
null, 00:20:23.409 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:23.409 "is_configured": false, 00:20:23.409 "data_offset": 0, 00:20:23.409 "data_size": 65536 00:20:23.409 }, 00:20:23.409 { 00:20:23.409 "name": "BaseBdev4", 00:20:23.409 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:23.409 "is_configured": true, 00:20:23.409 "data_offset": 0, 00:20:23.409 "data_size": 65536 00:20:23.409 } 00:20:23.409 ] 00:20:23.409 }' 00:20:23.409 10:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:23.409 10:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.975 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.975 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:24.232 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:24.232 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:24.490 [2024-07-15 10:28:01.561533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:24.490 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.748 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.748 "name": "Existed_Raid", 00:20:24.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.748 "strip_size_kb": 64, 00:20:24.748 "state": "configuring", 00:20:24.748 "raid_level": "concat", 00:20:24.748 "superblock": false, 00:20:24.748 "num_base_bdevs": 4, 00:20:24.748 "num_base_bdevs_discovered": 3, 00:20:24.748 "num_base_bdevs_operational": 4, 00:20:24.748 "base_bdevs_list": [ 00:20:24.748 { 00:20:24.748 "name": "BaseBdev1", 00:20:24.748 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:24.748 "is_configured": true, 00:20:24.748 "data_offset": 0, 00:20:24.748 "data_size": 65536 00:20:24.748 }, 00:20:24.748 { 00:20:24.748 "name": null, 00:20:24.748 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:24.748 "is_configured": false, 00:20:24.748 "data_offset": 0, 00:20:24.748 "data_size": 65536 00:20:24.748 }, 00:20:24.748 { 00:20:24.748 "name": "BaseBdev3", 00:20:24.748 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 
00:20:24.748 "is_configured": true, 00:20:24.748 "data_offset": 0, 00:20:24.748 "data_size": 65536 00:20:24.748 }, 00:20:24.748 { 00:20:24.748 "name": "BaseBdev4", 00:20:24.748 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:24.748 "is_configured": true, 00:20:24.748 "data_offset": 0, 00:20:24.748 "data_size": 65536 00:20:24.748 } 00:20:24.748 ] 00:20:24.748 }' 00:20:24.748 10:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.748 10:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:25.315 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:25.315 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.573 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:25.573 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:25.831 [2024-07-15 10:28:02.901121] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.831 10:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:26.090 10:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:26.090 "name": "Existed_Raid", 00:20:26.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:26.090 "strip_size_kb": 64, 00:20:26.090 "state": "configuring", 00:20:26.090 "raid_level": "concat", 00:20:26.090 "superblock": false, 00:20:26.090 "num_base_bdevs": 4, 00:20:26.090 "num_base_bdevs_discovered": 2, 00:20:26.090 "num_base_bdevs_operational": 4, 00:20:26.090 "base_bdevs_list": [ 00:20:26.090 { 00:20:26.090 "name": null, 00:20:26.090 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:26.090 "is_configured": false, 00:20:26.090 "data_offset": 0, 00:20:26.090 "data_size": 65536 00:20:26.090 }, 00:20:26.090 { 00:20:26.090 "name": null, 00:20:26.090 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:26.090 "is_configured": false, 00:20:26.090 "data_offset": 0, 00:20:26.090 "data_size": 65536 00:20:26.090 }, 00:20:26.090 { 00:20:26.090 "name": "BaseBdev3", 00:20:26.090 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:26.090 "is_configured": true, 00:20:26.090 "data_offset": 0, 00:20:26.090 "data_size": 65536 00:20:26.090 }, 
00:20:26.090 { 00:20:26.090 "name": "BaseBdev4", 00:20:26.090 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:26.090 "is_configured": true, 00:20:26.090 "data_offset": 0, 00:20:26.090 "data_size": 65536 00:20:26.090 } 00:20:26.090 ] 00:20:26.090 }' 00:20:26.090 10:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:26.090 10:28:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:26.655 10:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:26.655 10:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.914 10:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:26.914 10:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:27.172 [2024-07-15 10:28:04.148902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.172 
10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.172 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:27.431 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.431 "name": "Existed_Raid", 00:20:27.431 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.431 "strip_size_kb": 64, 00:20:27.431 "state": "configuring", 00:20:27.431 "raid_level": "concat", 00:20:27.431 "superblock": false, 00:20:27.431 "num_base_bdevs": 4, 00:20:27.431 "num_base_bdevs_discovered": 3, 00:20:27.431 "num_base_bdevs_operational": 4, 00:20:27.431 "base_bdevs_list": [ 00:20:27.431 { 00:20:27.431 "name": null, 00:20:27.431 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:27.431 "is_configured": false, 00:20:27.431 "data_offset": 0, 00:20:27.431 "data_size": 65536 00:20:27.431 }, 00:20:27.431 { 00:20:27.431 "name": "BaseBdev2", 00:20:27.431 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:27.431 "is_configured": true, 00:20:27.431 "data_offset": 0, 00:20:27.431 "data_size": 65536 00:20:27.431 }, 00:20:27.431 { 00:20:27.431 "name": "BaseBdev3", 00:20:27.431 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:27.431 "is_configured": true, 00:20:27.431 "data_offset": 0, 00:20:27.431 "data_size": 65536 00:20:27.431 }, 00:20:27.431 { 00:20:27.431 "name": "BaseBdev4", 00:20:27.431 "uuid": 
"48202d71-983d-4aef-8624-b2d28324d463", 00:20:27.431 "is_configured": true, 00:20:27.431 "data_offset": 0, 00:20:27.431 "data_size": 65536 00:20:27.431 } 00:20:27.431 ] 00:20:27.431 }' 00:20:27.431 10:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.431 10:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.998 10:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.998 10:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:28.256 10:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:28.256 10:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.256 10:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:28.515 10:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 94627db1-6799-407c-8d0f-c67ff83c8bd6 00:20:28.773 [2024-07-15 10:28:05.737668] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:28.773 [2024-07-15 10:28:05.737716] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xeee040 00:20:28.773 [2024-07-15 10:28:05.737725] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:20:28.773 [2024-07-15 10:28:05.737934] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xee9a70 00:20:28.773 [2024-07-15 10:28:05.738059] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0xeee040 00:20:28.773 [2024-07-15 10:28:05.738069] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xeee040 00:20:28.773 [2024-07-15 10:28:05.738240] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:28.773 NewBaseBdev 00:20:28.773 10:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:28.773 10:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:28.773 10:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:28.773 10:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:28.773 10:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:28.773 10:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:28.773 10:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:29.031 10:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:29.290 [ 00:20:29.290 { 00:20:29.290 "name": "NewBaseBdev", 00:20:29.290 "aliases": [ 00:20:29.290 "94627db1-6799-407c-8d0f-c67ff83c8bd6" 00:20:29.290 ], 00:20:29.290 "product_name": "Malloc disk", 00:20:29.290 "block_size": 512, 00:20:29.290 "num_blocks": 65536, 00:20:29.290 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:29.290 "assigned_rate_limits": { 00:20:29.290 "rw_ios_per_sec": 0, 00:20:29.290 "rw_mbytes_per_sec": 0, 00:20:29.290 "r_mbytes_per_sec": 0, 00:20:29.290 "w_mbytes_per_sec": 0 00:20:29.290 }, 00:20:29.290 "claimed": true, 00:20:29.290 
"claim_type": "exclusive_write", 00:20:29.290 "zoned": false, 00:20:29.290 "supported_io_types": { 00:20:29.290 "read": true, 00:20:29.290 "write": true, 00:20:29.290 "unmap": true, 00:20:29.290 "flush": true, 00:20:29.290 "reset": true, 00:20:29.290 "nvme_admin": false, 00:20:29.290 "nvme_io": false, 00:20:29.290 "nvme_io_md": false, 00:20:29.290 "write_zeroes": true, 00:20:29.290 "zcopy": true, 00:20:29.290 "get_zone_info": false, 00:20:29.290 "zone_management": false, 00:20:29.290 "zone_append": false, 00:20:29.290 "compare": false, 00:20:29.290 "compare_and_write": false, 00:20:29.290 "abort": true, 00:20:29.290 "seek_hole": false, 00:20:29.290 "seek_data": false, 00:20:29.290 "copy": true, 00:20:29.290 "nvme_iov_md": false 00:20:29.290 }, 00:20:29.290 "memory_domains": [ 00:20:29.290 { 00:20:29.290 "dma_device_id": "system", 00:20:29.290 "dma_device_type": 1 00:20:29.290 }, 00:20:29.290 { 00:20:29.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.290 "dma_device_type": 2 00:20:29.290 } 00:20:29.290 ], 00:20:29.290 "driver_specific": {} 00:20:29.290 } 00:20:29.290 ] 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.290 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:29.549 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:29.549 "name": "Existed_Raid", 00:20:29.549 "uuid": "440a4ddb-9630-4de5-96a0-23c3f885e63d", 00:20:29.549 "strip_size_kb": 64, 00:20:29.549 "state": "online", 00:20:29.549 "raid_level": "concat", 00:20:29.549 "superblock": false, 00:20:29.549 "num_base_bdevs": 4, 00:20:29.549 "num_base_bdevs_discovered": 4, 00:20:29.549 "num_base_bdevs_operational": 4, 00:20:29.549 "base_bdevs_list": [ 00:20:29.549 { 00:20:29.549 "name": "NewBaseBdev", 00:20:29.549 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:29.549 "is_configured": true, 00:20:29.549 "data_offset": 0, 00:20:29.549 "data_size": 65536 00:20:29.549 }, 00:20:29.549 { 00:20:29.549 "name": "BaseBdev2", 00:20:29.549 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:29.549 "is_configured": true, 00:20:29.549 "data_offset": 0, 00:20:29.549 "data_size": 65536 00:20:29.549 }, 00:20:29.549 { 00:20:29.549 "name": "BaseBdev3", 00:20:29.549 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:29.549 "is_configured": true, 00:20:29.549 "data_offset": 0, 00:20:29.549 "data_size": 65536 00:20:29.549 }, 00:20:29.549 { 00:20:29.549 "name": "BaseBdev4", 00:20:29.549 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:29.549 "is_configured": 
true, 00:20:29.549 "data_offset": 0, 00:20:29.549 "data_size": 65536 00:20:29.549 } 00:20:29.549 ] 00:20:29.549 }' 00:20:29.549 10:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:29.549 10:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:30.116 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:30.373 [2024-07-15 10:28:07.354289] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:30.373 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:30.373 "name": "Existed_Raid", 00:20:30.373 "aliases": [ 00:20:30.373 "440a4ddb-9630-4de5-96a0-23c3f885e63d" 00:20:30.373 ], 00:20:30.373 "product_name": "Raid Volume", 00:20:30.373 "block_size": 512, 00:20:30.373 "num_blocks": 262144, 00:20:30.373 "uuid": "440a4ddb-9630-4de5-96a0-23c3f885e63d", 00:20:30.373 "assigned_rate_limits": { 00:20:30.373 "rw_ios_per_sec": 0, 00:20:30.373 "rw_mbytes_per_sec": 0, 00:20:30.373 "r_mbytes_per_sec": 0, 00:20:30.373 
"w_mbytes_per_sec": 0 00:20:30.373 }, 00:20:30.373 "claimed": false, 00:20:30.373 "zoned": false, 00:20:30.373 "supported_io_types": { 00:20:30.373 "read": true, 00:20:30.373 "write": true, 00:20:30.373 "unmap": true, 00:20:30.373 "flush": true, 00:20:30.373 "reset": true, 00:20:30.373 "nvme_admin": false, 00:20:30.373 "nvme_io": false, 00:20:30.373 "nvme_io_md": false, 00:20:30.373 "write_zeroes": true, 00:20:30.373 "zcopy": false, 00:20:30.373 "get_zone_info": false, 00:20:30.373 "zone_management": false, 00:20:30.373 "zone_append": false, 00:20:30.373 "compare": false, 00:20:30.373 "compare_and_write": false, 00:20:30.373 "abort": false, 00:20:30.373 "seek_hole": false, 00:20:30.373 "seek_data": false, 00:20:30.373 "copy": false, 00:20:30.373 "nvme_iov_md": false 00:20:30.373 }, 00:20:30.373 "memory_domains": [ 00:20:30.373 { 00:20:30.373 "dma_device_id": "system", 00:20:30.373 "dma_device_type": 1 00:20:30.373 }, 00:20:30.373 { 00:20:30.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.373 "dma_device_type": 2 00:20:30.373 }, 00:20:30.373 { 00:20:30.373 "dma_device_id": "system", 00:20:30.373 "dma_device_type": 1 00:20:30.373 }, 00:20:30.373 { 00:20:30.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.373 "dma_device_type": 2 00:20:30.373 }, 00:20:30.373 { 00:20:30.373 "dma_device_id": "system", 00:20:30.373 "dma_device_type": 1 00:20:30.373 }, 00:20:30.373 { 00:20:30.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.374 "dma_device_type": 2 00:20:30.374 }, 00:20:30.374 { 00:20:30.374 "dma_device_id": "system", 00:20:30.374 "dma_device_type": 1 00:20:30.374 }, 00:20:30.374 { 00:20:30.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.374 "dma_device_type": 2 00:20:30.374 } 00:20:30.374 ], 00:20:30.374 "driver_specific": { 00:20:30.374 "raid": { 00:20:30.374 "uuid": "440a4ddb-9630-4de5-96a0-23c3f885e63d", 00:20:30.374 "strip_size_kb": 64, 00:20:30.374 "state": "online", 00:20:30.374 "raid_level": "concat", 00:20:30.374 "superblock": false, 
00:20:30.374 "num_base_bdevs": 4, 00:20:30.374 "num_base_bdevs_discovered": 4, 00:20:30.374 "num_base_bdevs_operational": 4, 00:20:30.374 "base_bdevs_list": [ 00:20:30.374 { 00:20:30.374 "name": "NewBaseBdev", 00:20:30.374 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:30.374 "is_configured": true, 00:20:30.374 "data_offset": 0, 00:20:30.374 "data_size": 65536 00:20:30.374 }, 00:20:30.374 { 00:20:30.374 "name": "BaseBdev2", 00:20:30.374 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:30.374 "is_configured": true, 00:20:30.374 "data_offset": 0, 00:20:30.374 "data_size": 65536 00:20:30.374 }, 00:20:30.374 { 00:20:30.374 "name": "BaseBdev3", 00:20:30.374 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:30.374 "is_configured": true, 00:20:30.374 "data_offset": 0, 00:20:30.374 "data_size": 65536 00:20:30.374 }, 00:20:30.374 { 00:20:30.374 "name": "BaseBdev4", 00:20:30.374 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:30.374 "is_configured": true, 00:20:30.374 "data_offset": 0, 00:20:30.374 "data_size": 65536 00:20:30.374 } 00:20:30.374 ] 00:20:30.374 } 00:20:30.374 } 00:20:30.374 }' 00:20:30.374 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:30.374 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:30.374 BaseBdev2 00:20:30.374 BaseBdev3 00:20:30.374 BaseBdev4' 00:20:30.374 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:30.374 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:30.374 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:30.631 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
base_bdev_info='{ 00:20:30.631 "name": "NewBaseBdev", 00:20:30.631 "aliases": [ 00:20:30.631 "94627db1-6799-407c-8d0f-c67ff83c8bd6" 00:20:30.631 ], 00:20:30.631 "product_name": "Malloc disk", 00:20:30.631 "block_size": 512, 00:20:30.631 "num_blocks": 65536, 00:20:30.631 "uuid": "94627db1-6799-407c-8d0f-c67ff83c8bd6", 00:20:30.631 "assigned_rate_limits": { 00:20:30.631 "rw_ios_per_sec": 0, 00:20:30.631 "rw_mbytes_per_sec": 0, 00:20:30.631 "r_mbytes_per_sec": 0, 00:20:30.631 "w_mbytes_per_sec": 0 00:20:30.631 }, 00:20:30.631 "claimed": true, 00:20:30.631 "claim_type": "exclusive_write", 00:20:30.631 "zoned": false, 00:20:30.631 "supported_io_types": { 00:20:30.631 "read": true, 00:20:30.631 "write": true, 00:20:30.631 "unmap": true, 00:20:30.631 "flush": true, 00:20:30.631 "reset": true, 00:20:30.631 "nvme_admin": false, 00:20:30.631 "nvme_io": false, 00:20:30.631 "nvme_io_md": false, 00:20:30.631 "write_zeroes": true, 00:20:30.631 "zcopy": true, 00:20:30.631 "get_zone_info": false, 00:20:30.631 "zone_management": false, 00:20:30.631 "zone_append": false, 00:20:30.631 "compare": false, 00:20:30.631 "compare_and_write": false, 00:20:30.631 "abort": true, 00:20:30.631 "seek_hole": false, 00:20:30.631 "seek_data": false, 00:20:30.631 "copy": true, 00:20:30.631 "nvme_iov_md": false 00:20:30.631 }, 00:20:30.631 "memory_domains": [ 00:20:30.631 { 00:20:30.631 "dma_device_id": "system", 00:20:30.631 "dma_device_type": 1 00:20:30.631 }, 00:20:30.631 { 00:20:30.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.631 "dma_device_type": 2 00:20:30.631 } 00:20:30.631 ], 00:20:30.631 "driver_specific": {} 00:20:30.631 }' 00:20:30.631 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.631 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:30.631 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:30.631 10:28:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.631 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.888 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:30.889 10:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.146 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.147 "name": "BaseBdev2", 00:20:31.147 "aliases": [ 00:20:31.147 "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f" 00:20:31.147 ], 00:20:31.147 "product_name": "Malloc disk", 00:20:31.147 "block_size": 512, 00:20:31.147 "num_blocks": 65536, 00:20:31.147 "uuid": "3a85a6a7-078d-4c2f-a1e7-ea0a6a7ed93f", 00:20:31.147 "assigned_rate_limits": { 00:20:31.147 "rw_ios_per_sec": 0, 00:20:31.147 "rw_mbytes_per_sec": 0, 00:20:31.147 "r_mbytes_per_sec": 0, 00:20:31.147 "w_mbytes_per_sec": 0 00:20:31.147 }, 00:20:31.147 "claimed": true, 00:20:31.147 "claim_type": "exclusive_write", 
00:20:31.147 "zoned": false, 00:20:31.147 "supported_io_types": { 00:20:31.147 "read": true, 00:20:31.147 "write": true, 00:20:31.147 "unmap": true, 00:20:31.147 "flush": true, 00:20:31.147 "reset": true, 00:20:31.147 "nvme_admin": false, 00:20:31.147 "nvme_io": false, 00:20:31.147 "nvme_io_md": false, 00:20:31.147 "write_zeroes": true, 00:20:31.147 "zcopy": true, 00:20:31.147 "get_zone_info": false, 00:20:31.147 "zone_management": false, 00:20:31.147 "zone_append": false, 00:20:31.147 "compare": false, 00:20:31.147 "compare_and_write": false, 00:20:31.147 "abort": true, 00:20:31.147 "seek_hole": false, 00:20:31.147 "seek_data": false, 00:20:31.147 "copy": true, 00:20:31.147 "nvme_iov_md": false 00:20:31.147 }, 00:20:31.147 "memory_domains": [ 00:20:31.147 { 00:20:31.147 "dma_device_id": "system", 00:20:31.147 "dma_device_type": 1 00:20:31.147 }, 00:20:31.147 { 00:20:31.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.147 "dma_device_type": 2 00:20:31.147 } 00:20:31.147 ], 00:20:31.147 "driver_specific": {} 00:20:31.147 }' 00:20:31.147 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.147 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.147 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.147 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.147 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.147 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.147 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.405 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.405 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.405 10:28:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.405 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.405 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.405 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.405 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:31.405 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.661 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:31.661 "name": "BaseBdev3", 00:20:31.661 "aliases": [ 00:20:31.661 "1528e8ed-a2e4-444d-83a2-c3c9172f48cf" 00:20:31.661 ], 00:20:31.661 "product_name": "Malloc disk", 00:20:31.661 "block_size": 512, 00:20:31.661 "num_blocks": 65536, 00:20:31.661 "uuid": "1528e8ed-a2e4-444d-83a2-c3c9172f48cf", 00:20:31.661 "assigned_rate_limits": { 00:20:31.661 "rw_ios_per_sec": 0, 00:20:31.661 "rw_mbytes_per_sec": 0, 00:20:31.661 "r_mbytes_per_sec": 0, 00:20:31.661 "w_mbytes_per_sec": 0 00:20:31.661 }, 00:20:31.661 "claimed": true, 00:20:31.661 "claim_type": "exclusive_write", 00:20:31.661 "zoned": false, 00:20:31.661 "supported_io_types": { 00:20:31.661 "read": true, 00:20:31.661 "write": true, 00:20:31.661 "unmap": true, 00:20:31.661 "flush": true, 00:20:31.661 "reset": true, 00:20:31.661 "nvme_admin": false, 00:20:31.661 "nvme_io": false, 00:20:31.661 "nvme_io_md": false, 00:20:31.661 "write_zeroes": true, 00:20:31.661 "zcopy": true, 00:20:31.661 "get_zone_info": false, 00:20:31.661 "zone_management": false, 00:20:31.661 "zone_append": false, 00:20:31.661 "compare": false, 00:20:31.662 "compare_and_write": false, 00:20:31.662 "abort": true, 00:20:31.662 "seek_hole": false, 
00:20:31.662 "seek_data": false, 00:20:31.662 "copy": true, 00:20:31.662 "nvme_iov_md": false 00:20:31.662 }, 00:20:31.662 "memory_domains": [ 00:20:31.662 { 00:20:31.662 "dma_device_id": "system", 00:20:31.662 "dma_device_type": 1 00:20:31.662 }, 00:20:31.662 { 00:20:31.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.662 "dma_device_type": 2 00:20:31.662 } 00:20:31.662 ], 00:20:31.662 "driver_specific": {} 00:20:31.662 }' 00:20:31.662 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.662 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:31.662 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:31.662 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.662 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:31.662 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:31.662 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.917 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:31.918 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:31.918 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.918 10:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:31.918 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:31.918 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:31.918 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:31.918 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:32.482 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.482 "name": "BaseBdev4", 00:20:32.482 "aliases": [ 00:20:32.482 "48202d71-983d-4aef-8624-b2d28324d463" 00:20:32.482 ], 00:20:32.482 "product_name": "Malloc disk", 00:20:32.482 "block_size": 512, 00:20:32.482 "num_blocks": 65536, 00:20:32.482 "uuid": "48202d71-983d-4aef-8624-b2d28324d463", 00:20:32.482 "assigned_rate_limits": { 00:20:32.482 "rw_ios_per_sec": 0, 00:20:32.482 "rw_mbytes_per_sec": 0, 00:20:32.482 "r_mbytes_per_sec": 0, 00:20:32.482 "w_mbytes_per_sec": 0 00:20:32.482 }, 00:20:32.482 "claimed": true, 00:20:32.482 "claim_type": "exclusive_write", 00:20:32.482 "zoned": false, 00:20:32.482 "supported_io_types": { 00:20:32.482 "read": true, 00:20:32.482 "write": true, 00:20:32.482 "unmap": true, 00:20:32.482 "flush": true, 00:20:32.482 "reset": true, 00:20:32.482 "nvme_admin": false, 00:20:32.482 "nvme_io": false, 00:20:32.482 "nvme_io_md": false, 00:20:32.482 "write_zeroes": true, 00:20:32.482 "zcopy": true, 00:20:32.482 "get_zone_info": false, 00:20:32.482 "zone_management": false, 00:20:32.482 "zone_append": false, 00:20:32.482 "compare": false, 00:20:32.482 "compare_and_write": false, 00:20:32.482 "abort": true, 00:20:32.482 "seek_hole": false, 00:20:32.482 "seek_data": false, 00:20:32.482 "copy": true, 00:20:32.482 "nvme_iov_md": false 00:20:32.482 }, 00:20:32.482 "memory_domains": [ 00:20:32.482 { 00:20:32.482 "dma_device_id": "system", 00:20:32.482 "dma_device_type": 1 00:20:32.482 }, 00:20:32.482 { 00:20:32.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.482 "dma_device_type": 2 00:20:32.482 } 00:20:32.482 ], 00:20:32.482 "driver_specific": {} 00:20:32.482 }' 00:20:32.482 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.482 10:28:09 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.482 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.482 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.482 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:32.739 10:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:32.997 [2024-07-15 10:28:10.041281] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:32.997 [2024-07-15 10:28:10.041315] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:32.997 [2024-07-15 10:28:10.041375] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:32.997 [2024-07-15 10:28:10.041437] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:32.997 [2024-07-15 10:28:10.041449] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeee040 name Existed_Raid, state offline 00:20:32.997 10:28:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 550310 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 550310 ']' 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 550310 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 550310 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 550310' 00:20:32.997 killing process with pid 550310 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 550310 00:20:32.997 [2024-07-15 10:28:10.108871] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:32.997 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 550310 00:20:32.997 [2024-07-15 10:28:10.151086] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:33.255 10:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:33.255 00:20:33.255 real 0m32.098s 00:20:33.255 user 0m58.760s 00:20:33.255 sys 0m5.787s 00:20:33.255 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:33.255 10:28:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.255 ************************************ 00:20:33.255 END TEST raid_state_function_test 
00:20:33.255 ************************************ 00:20:33.255 10:28:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:33.255 10:28:10 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:20:33.255 10:28:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:33.255 10:28:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:33.255 10:28:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:33.554 ************************************ 00:20:33.554 START TEST raid_state_function_test_sb 00:20:33.554 ************************************ 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:33.554 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 
00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=555070 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 555070' 00:20:33.555 Process raid pid: 555070 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 555070 /var/tmp/spdk-raid.sock 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 555070 ']' 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:33.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:33.555 10:28:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:33.555 [2024-07-15 10:28:10.531321] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:20:33.555 [2024-07-15 10:28:10.531390] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:33.555 [2024-07-15 10:28:10.664527] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:33.813 [2024-07-15 10:28:10.768116] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:33.813 [2024-07-15 10:28:10.826106] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:33.813 [2024-07-15 10:28:10.826139] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:34.377 10:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:34.377 10:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:20:34.377 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:34.635 [2024-07-15 10:28:11.697711] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:34.635 [2024-07-15 10:28:11.697755] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:34.635 [2024-07-15 10:28:11.697766] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:34.635 [2024-07-15 10:28:11.697778] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:34.635 [2024-07-15 10:28:11.697787] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:34.635 [2024-07-15 10:28:11.697798] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:20:34.635 [2024-07-15 10:28:11.697807] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:34.635 [2024-07-15 10:28:11.697818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.635 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:34.893 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.893 "name": "Existed_Raid", 00:20:34.893 "uuid": 
"a2d43c09-8de0-494e-aeea-02d92bc6f087", 00:20:34.893 "strip_size_kb": 64, 00:20:34.893 "state": "configuring", 00:20:34.893 "raid_level": "concat", 00:20:34.893 "superblock": true, 00:20:34.893 "num_base_bdevs": 4, 00:20:34.893 "num_base_bdevs_discovered": 0, 00:20:34.893 "num_base_bdevs_operational": 4, 00:20:34.893 "base_bdevs_list": [ 00:20:34.893 { 00:20:34.893 "name": "BaseBdev1", 00:20:34.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.893 "is_configured": false, 00:20:34.893 "data_offset": 0, 00:20:34.893 "data_size": 0 00:20:34.893 }, 00:20:34.893 { 00:20:34.893 "name": "BaseBdev2", 00:20:34.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.893 "is_configured": false, 00:20:34.894 "data_offset": 0, 00:20:34.894 "data_size": 0 00:20:34.894 }, 00:20:34.894 { 00:20:34.894 "name": "BaseBdev3", 00:20:34.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.894 "is_configured": false, 00:20:34.894 "data_offset": 0, 00:20:34.894 "data_size": 0 00:20:34.894 }, 00:20:34.894 { 00:20:34.894 "name": "BaseBdev4", 00:20:34.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.894 "is_configured": false, 00:20:34.894 "data_offset": 0, 00:20:34.894 "data_size": 0 00:20:34.894 } 00:20:34.894 ] 00:20:34.894 }' 00:20:34.894 10:28:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.894 10:28:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.459 10:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:35.717 [2024-07-15 10:28:12.684189] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:35.717 [2024-07-15 10:28:12.684226] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbbaa0 name Existed_Raid, state configuring 00:20:35.718 10:28:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:35.976 [2024-07-15 10:28:12.928862] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:35.976 [2024-07-15 10:28:12.928895] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:35.976 [2024-07-15 10:28:12.928905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:35.976 [2024-07-15 10:28:12.928917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:35.976 [2024-07-15 10:28:12.928930] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:35.976 [2024-07-15 10:28:12.928941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:35.976 [2024-07-15 10:28:12.928950] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:35.976 [2024-07-15 10:28:12.928960] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:35.976 10:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:35.976 [2024-07-15 10:28:13.104434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:35.976 BaseBdev1 00:20:35.976 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:35.976 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:35.976 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:20:35.976 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:35.976 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:35.976 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:35.976 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:36.234 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:36.492 [ 00:20:36.492 { 00:20:36.492 "name": "BaseBdev1", 00:20:36.492 "aliases": [ 00:20:36.492 "b07d44a1-595c-412d-acf0-101f8f82d2fb" 00:20:36.492 ], 00:20:36.492 "product_name": "Malloc disk", 00:20:36.492 "block_size": 512, 00:20:36.492 "num_blocks": 65536, 00:20:36.492 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:36.492 "assigned_rate_limits": { 00:20:36.492 "rw_ios_per_sec": 0, 00:20:36.492 "rw_mbytes_per_sec": 0, 00:20:36.492 "r_mbytes_per_sec": 0, 00:20:36.492 "w_mbytes_per_sec": 0 00:20:36.492 }, 00:20:36.492 "claimed": true, 00:20:36.492 "claim_type": "exclusive_write", 00:20:36.492 "zoned": false, 00:20:36.492 "supported_io_types": { 00:20:36.492 "read": true, 00:20:36.492 "write": true, 00:20:36.492 "unmap": true, 00:20:36.492 "flush": true, 00:20:36.492 "reset": true, 00:20:36.492 "nvme_admin": false, 00:20:36.492 "nvme_io": false, 00:20:36.492 "nvme_io_md": false, 00:20:36.492 "write_zeroes": true, 00:20:36.492 "zcopy": true, 00:20:36.492 "get_zone_info": false, 00:20:36.492 "zone_management": false, 00:20:36.492 "zone_append": false, 00:20:36.492 "compare": false, 00:20:36.492 "compare_and_write": false, 00:20:36.492 "abort": true, 00:20:36.492 "seek_hole": 
false, 00:20:36.492 "seek_data": false, 00:20:36.492 "copy": true, 00:20:36.492 "nvme_iov_md": false 00:20:36.492 }, 00:20:36.492 "memory_domains": [ 00:20:36.492 { 00:20:36.492 "dma_device_id": "system", 00:20:36.493 "dma_device_type": 1 00:20:36.493 }, 00:20:36.493 { 00:20:36.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.493 "dma_device_type": 2 00:20:36.493 } 00:20:36.493 ], 00:20:36.493 "driver_specific": {} 00:20:36.493 } 00:20:36.493 ] 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.493 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.493 10:28:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:36.751 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.751 "name": "Existed_Raid", 00:20:36.751 "uuid": "08803942-d44e-4c22-818c-40d2af9f9ffe", 00:20:36.751 "strip_size_kb": 64, 00:20:36.751 "state": "configuring", 00:20:36.751 "raid_level": "concat", 00:20:36.751 "superblock": true, 00:20:36.751 "num_base_bdevs": 4, 00:20:36.751 "num_base_bdevs_discovered": 1, 00:20:36.751 "num_base_bdevs_operational": 4, 00:20:36.751 "base_bdevs_list": [ 00:20:36.751 { 00:20:36.751 "name": "BaseBdev1", 00:20:36.751 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:36.751 "is_configured": true, 00:20:36.751 "data_offset": 2048, 00:20:36.751 "data_size": 63488 00:20:36.751 }, 00:20:36.751 { 00:20:36.751 "name": "BaseBdev2", 00:20:36.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.751 "is_configured": false, 00:20:36.751 "data_offset": 0, 00:20:36.751 "data_size": 0 00:20:36.751 }, 00:20:36.751 { 00:20:36.751 "name": "BaseBdev3", 00:20:36.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.751 "is_configured": false, 00:20:36.751 "data_offset": 0, 00:20:36.751 "data_size": 0 00:20:36.751 }, 00:20:36.751 { 00:20:36.751 "name": "BaseBdev4", 00:20:36.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.751 "is_configured": false, 00:20:36.751 "data_offset": 0, 00:20:36.751 "data_size": 0 00:20:36.751 } 00:20:36.751 ] 00:20:36.751 }' 00:20:36.751 10:28:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.751 10:28:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:37.317 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:37.575 [2024-07-15 
10:28:14.620470] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:37.575 [2024-07-15 10:28:14.620509] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbb310 name Existed_Raid, state configuring 00:20:37.575 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:37.833 [2024-07-15 10:28:14.801004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:37.833 [2024-07-15 10:28:14.802448] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:37.833 [2024-07-15 10:28:14.802483] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:37.833 [2024-07-15 10:28:14.802494] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:37.833 [2024-07-15 10:28:14.802511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:37.833 [2024-07-15 10:28:14.802520] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:37.834 [2024-07-15 10:28:14.802531] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.834 10:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:37.834 10:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.834 "name": "Existed_Raid", 00:20:37.834 "uuid": "81ae4c11-eb1c-44ec-92e4-8ac61786eb20", 00:20:37.834 "strip_size_kb": 64, 00:20:37.834 "state": "configuring", 00:20:37.834 "raid_level": "concat", 00:20:37.834 "superblock": true, 00:20:37.834 "num_base_bdevs": 4, 00:20:37.834 "num_base_bdevs_discovered": 1, 00:20:37.834 "num_base_bdevs_operational": 4, 00:20:37.834 "base_bdevs_list": [ 00:20:37.834 { 00:20:37.834 "name": "BaseBdev1", 00:20:37.834 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:37.834 "is_configured": true, 00:20:37.834 "data_offset": 2048, 00:20:37.834 "data_size": 63488 00:20:37.834 }, 00:20:37.834 { 00:20:37.834 "name": "BaseBdev2", 00:20:37.834 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:20:37.834 "is_configured": false, 00:20:37.834 "data_offset": 0, 00:20:37.834 "data_size": 0 00:20:37.834 }, 00:20:37.834 { 00:20:37.834 "name": "BaseBdev3", 00:20:37.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.834 "is_configured": false, 00:20:37.834 "data_offset": 0, 00:20:37.834 "data_size": 0 00:20:37.834 }, 00:20:37.834 { 00:20:37.834 "name": "BaseBdev4", 00:20:37.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.834 "is_configured": false, 00:20:37.834 "data_offset": 0, 00:20:37.834 "data_size": 0 00:20:37.834 } 00:20:37.834 ] 00:20:37.834 }' 00:20:37.834 10:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.834 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:38.400 10:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:38.658 [2024-07-15 10:28:15.742863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:38.658 BaseBdev2 00:20:38.658 10:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:38.658 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:38.658 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:38.658 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:38.658 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:38.658 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:38.658 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:38.917 10:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:38.917 [ 00:20:38.917 { 00:20:38.917 "name": "BaseBdev2", 00:20:38.917 "aliases": [ 00:20:38.917 "22164232-47bf-4620-a505-fa8fae6f94c9" 00:20:38.917 ], 00:20:38.917 "product_name": "Malloc disk", 00:20:38.917 "block_size": 512, 00:20:38.917 "num_blocks": 65536, 00:20:38.917 "uuid": "22164232-47bf-4620-a505-fa8fae6f94c9", 00:20:38.917 "assigned_rate_limits": { 00:20:38.917 "rw_ios_per_sec": 0, 00:20:38.917 "rw_mbytes_per_sec": 0, 00:20:38.917 "r_mbytes_per_sec": 0, 00:20:38.917 "w_mbytes_per_sec": 0 00:20:38.917 }, 00:20:38.917 "claimed": true, 00:20:38.917 "claim_type": "exclusive_write", 00:20:38.917 "zoned": false, 00:20:38.917 "supported_io_types": { 00:20:38.917 "read": true, 00:20:38.917 "write": true, 00:20:38.917 "unmap": true, 00:20:38.917 "flush": true, 00:20:38.917 "reset": true, 00:20:38.917 "nvme_admin": false, 00:20:38.917 "nvme_io": false, 00:20:38.917 "nvme_io_md": false, 00:20:38.917 "write_zeroes": true, 00:20:38.917 "zcopy": true, 00:20:38.917 "get_zone_info": false, 00:20:38.917 "zone_management": false, 00:20:38.917 "zone_append": false, 00:20:38.917 "compare": false, 00:20:38.917 "compare_and_write": false, 00:20:38.917 "abort": true, 00:20:38.917 "seek_hole": false, 00:20:38.917 "seek_data": false, 00:20:38.917 "copy": true, 00:20:38.917 "nvme_iov_md": false 00:20:38.917 }, 00:20:38.917 "memory_domains": [ 00:20:38.917 { 00:20:38.917 "dma_device_id": "system", 00:20:38.917 "dma_device_type": 1 00:20:38.917 }, 00:20:38.917 { 00:20:38.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.917 "dma_device_type": 2 00:20:38.917 } 00:20:38.917 ], 00:20:38.917 "driver_specific": {} 00:20:38.917 } 00:20:38.917 ] 
00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:38.917 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.175 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.175 "name": "Existed_Raid", 
00:20:39.175 "uuid": "81ae4c11-eb1c-44ec-92e4-8ac61786eb20", 00:20:39.175 "strip_size_kb": 64, 00:20:39.175 "state": "configuring", 00:20:39.175 "raid_level": "concat", 00:20:39.175 "superblock": true, 00:20:39.175 "num_base_bdevs": 4, 00:20:39.175 "num_base_bdevs_discovered": 2, 00:20:39.175 "num_base_bdevs_operational": 4, 00:20:39.175 "base_bdevs_list": [ 00:20:39.175 { 00:20:39.175 "name": "BaseBdev1", 00:20:39.175 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:39.175 "is_configured": true, 00:20:39.175 "data_offset": 2048, 00:20:39.175 "data_size": 63488 00:20:39.175 }, 00:20:39.175 { 00:20:39.175 "name": "BaseBdev2", 00:20:39.175 "uuid": "22164232-47bf-4620-a505-fa8fae6f94c9", 00:20:39.175 "is_configured": true, 00:20:39.175 "data_offset": 2048, 00:20:39.175 "data_size": 63488 00:20:39.175 }, 00:20:39.175 { 00:20:39.175 "name": "BaseBdev3", 00:20:39.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.175 "is_configured": false, 00:20:39.175 "data_offset": 0, 00:20:39.175 "data_size": 0 00:20:39.175 }, 00:20:39.175 { 00:20:39.175 "name": "BaseBdev4", 00:20:39.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.175 "is_configured": false, 00:20:39.175 "data_offset": 0, 00:20:39.175 "data_size": 0 00:20:39.175 } 00:20:39.175 ] 00:20:39.175 }' 00:20:39.175 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.175 10:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:39.741 10:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:39.999 [2024-07-15 10:28:17.021724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:39.999 BaseBdev3 00:20:39.999 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:39.999 
10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:39.999 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:39.999 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:39.999 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:39.999 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:39.999 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:40.257 [ 00:20:40.257 { 00:20:40.257 "name": "BaseBdev3", 00:20:40.257 "aliases": [ 00:20:40.257 "4806fdeb-ea12-4ed0-b876-729bbcadf549" 00:20:40.257 ], 00:20:40.257 "product_name": "Malloc disk", 00:20:40.257 "block_size": 512, 00:20:40.257 "num_blocks": 65536, 00:20:40.257 "uuid": "4806fdeb-ea12-4ed0-b876-729bbcadf549", 00:20:40.257 "assigned_rate_limits": { 00:20:40.257 "rw_ios_per_sec": 0, 00:20:40.257 "rw_mbytes_per_sec": 0, 00:20:40.257 "r_mbytes_per_sec": 0, 00:20:40.257 "w_mbytes_per_sec": 0 00:20:40.257 }, 00:20:40.257 "claimed": true, 00:20:40.257 "claim_type": "exclusive_write", 00:20:40.257 "zoned": false, 00:20:40.257 "supported_io_types": { 00:20:40.257 "read": true, 00:20:40.257 "write": true, 00:20:40.257 "unmap": true, 00:20:40.257 "flush": true, 00:20:40.257 "reset": true, 00:20:40.257 "nvme_admin": false, 00:20:40.257 "nvme_io": false, 00:20:40.257 "nvme_io_md": false, 00:20:40.257 "write_zeroes": true, 00:20:40.257 "zcopy": true, 00:20:40.257 "get_zone_info": 
false, 00:20:40.257 "zone_management": false, 00:20:40.257 "zone_append": false, 00:20:40.257 "compare": false, 00:20:40.257 "compare_and_write": false, 00:20:40.257 "abort": true, 00:20:40.257 "seek_hole": false, 00:20:40.257 "seek_data": false, 00:20:40.257 "copy": true, 00:20:40.257 "nvme_iov_md": false 00:20:40.257 }, 00:20:40.257 "memory_domains": [ 00:20:40.257 { 00:20:40.257 "dma_device_id": "system", 00:20:40.257 "dma_device_type": 1 00:20:40.257 }, 00:20:40.257 { 00:20:40.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.257 "dma_device_type": 2 00:20:40.257 } 00:20:40.257 ], 00:20:40.257 "driver_specific": {} 00:20:40.257 } 00:20:40.257 ] 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.257 10:28:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.257 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:40.515 10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.515 "name": "Existed_Raid", 00:20:40.515 "uuid": "81ae4c11-eb1c-44ec-92e4-8ac61786eb20", 00:20:40.515 "strip_size_kb": 64, 00:20:40.515 "state": "configuring", 00:20:40.515 "raid_level": "concat", 00:20:40.515 "superblock": true, 00:20:40.515 "num_base_bdevs": 4, 00:20:40.515 "num_base_bdevs_discovered": 3, 00:20:40.515 "num_base_bdevs_operational": 4, 00:20:40.515 "base_bdevs_list": [ 00:20:40.515 { 00:20:40.515 "name": "BaseBdev1", 00:20:40.515 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:40.515 "is_configured": true, 00:20:40.515 "data_offset": 2048, 00:20:40.515 "data_size": 63488 00:20:40.515 }, 00:20:40.515 { 00:20:40.515 "name": "BaseBdev2", 00:20:40.515 "uuid": "22164232-47bf-4620-a505-fa8fae6f94c9", 00:20:40.515 "is_configured": true, 00:20:40.515 "data_offset": 2048, 00:20:40.515 "data_size": 63488 00:20:40.515 }, 00:20:40.515 { 00:20:40.516 "name": "BaseBdev3", 00:20:40.516 "uuid": "4806fdeb-ea12-4ed0-b876-729bbcadf549", 00:20:40.516 "is_configured": true, 00:20:40.516 "data_offset": 2048, 00:20:40.516 "data_size": 63488 00:20:40.516 }, 00:20:40.516 { 00:20:40.516 "name": "BaseBdev4", 00:20:40.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.516 "is_configured": false, 00:20:40.516 "data_offset": 0, 00:20:40.516 "data_size": 0 00:20:40.516 } 00:20:40.516 ] 00:20:40.516 }' 00:20:40.516 
10:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.516 10:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.081 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:41.339 [2024-07-15 10:28:18.432939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:41.339 [2024-07-15 10:28:18.433119] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bbc350 00:20:41.339 [2024-07-15 10:28:18.433134] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:41.339 [2024-07-15 10:28:18.433314] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bbc020 00:20:41.339 [2024-07-15 10:28:18.433439] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bbc350 00:20:41.339 [2024-07-15 10:28:18.433449] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bbc350 00:20:41.339 [2024-07-15 10:28:18.433539] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:41.339 BaseBdev4 00:20:41.339 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:41.339 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:41.339 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:41.339 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:41.339 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:41.339 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:20:41.339 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:41.597 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:41.597 [ 00:20:41.597 { 00:20:41.597 "name": "BaseBdev4", 00:20:41.597 "aliases": [ 00:20:41.597 "6c1f7fb4-e170-4eeb-b5b7-d064568af277" 00:20:41.597 ], 00:20:41.597 "product_name": "Malloc disk", 00:20:41.597 "block_size": 512, 00:20:41.597 "num_blocks": 65536, 00:20:41.597 "uuid": "6c1f7fb4-e170-4eeb-b5b7-d064568af277", 00:20:41.597 "assigned_rate_limits": { 00:20:41.597 "rw_ios_per_sec": 0, 00:20:41.597 "rw_mbytes_per_sec": 0, 00:20:41.597 "r_mbytes_per_sec": 0, 00:20:41.597 "w_mbytes_per_sec": 0 00:20:41.597 }, 00:20:41.597 "claimed": true, 00:20:41.597 "claim_type": "exclusive_write", 00:20:41.597 "zoned": false, 00:20:41.597 "supported_io_types": { 00:20:41.597 "read": true, 00:20:41.597 "write": true, 00:20:41.597 "unmap": true, 00:20:41.597 "flush": true, 00:20:41.597 "reset": true, 00:20:41.597 "nvme_admin": false, 00:20:41.597 "nvme_io": false, 00:20:41.597 "nvme_io_md": false, 00:20:41.597 "write_zeroes": true, 00:20:41.597 "zcopy": true, 00:20:41.597 "get_zone_info": false, 00:20:41.597 "zone_management": false, 00:20:41.597 "zone_append": false, 00:20:41.597 "compare": false, 00:20:41.597 "compare_and_write": false, 00:20:41.597 "abort": true, 00:20:41.597 "seek_hole": false, 00:20:41.597 "seek_data": false, 00:20:41.597 "copy": true, 00:20:41.598 "nvme_iov_md": false 00:20:41.598 }, 00:20:41.598 "memory_domains": [ 00:20:41.598 { 00:20:41.598 "dma_device_id": "system", 00:20:41.598 "dma_device_type": 1 00:20:41.598 }, 00:20:41.598 { 00:20:41.598 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.598 
"dma_device_type": 2 00:20:41.598 } 00:20:41.598 ], 00:20:41.598 "driver_specific": {} 00:20:41.598 } 00:20:41.598 ] 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.855 10:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:42.119 10:28:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.120 "name": "Existed_Raid", 00:20:42.120 "uuid": "81ae4c11-eb1c-44ec-92e4-8ac61786eb20", 00:20:42.120 "strip_size_kb": 64, 00:20:42.120 "state": "online", 00:20:42.120 "raid_level": "concat", 00:20:42.120 "superblock": true, 00:20:42.120 "num_base_bdevs": 4, 00:20:42.120 "num_base_bdevs_discovered": 4, 00:20:42.120 "num_base_bdevs_operational": 4, 00:20:42.120 "base_bdevs_list": [ 00:20:42.120 { 00:20:42.120 "name": "BaseBdev1", 00:20:42.120 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:42.120 "is_configured": true, 00:20:42.120 "data_offset": 2048, 00:20:42.120 "data_size": 63488 00:20:42.120 }, 00:20:42.120 { 00:20:42.120 "name": "BaseBdev2", 00:20:42.120 "uuid": "22164232-47bf-4620-a505-fa8fae6f94c9", 00:20:42.120 "is_configured": true, 00:20:42.120 "data_offset": 2048, 00:20:42.120 "data_size": 63488 00:20:42.120 }, 00:20:42.120 { 00:20:42.120 "name": "BaseBdev3", 00:20:42.120 "uuid": "4806fdeb-ea12-4ed0-b876-729bbcadf549", 00:20:42.120 "is_configured": true, 00:20:42.120 "data_offset": 2048, 00:20:42.120 "data_size": 63488 00:20:42.120 }, 00:20:42.120 { 00:20:42.120 "name": "BaseBdev4", 00:20:42.120 "uuid": "6c1f7fb4-e170-4eeb-b5b7-d064568af277", 00:20:42.120 "is_configured": true, 00:20:42.120 "data_offset": 2048, 00:20:42.120 "data_size": 63488 00:20:42.120 } 00:20:42.120 ] 00:20:42.120 }' 00:20:42.120 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.120 10:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:42.688 [2024-07-15 10:28:19.853053] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:42.688 "name": "Existed_Raid", 00:20:42.688 "aliases": [ 00:20:42.688 "81ae4c11-eb1c-44ec-92e4-8ac61786eb20" 00:20:42.688 ], 00:20:42.688 "product_name": "Raid Volume", 00:20:42.688 "block_size": 512, 00:20:42.688 "num_blocks": 253952, 00:20:42.688 "uuid": "81ae4c11-eb1c-44ec-92e4-8ac61786eb20", 00:20:42.688 "assigned_rate_limits": { 00:20:42.688 "rw_ios_per_sec": 0, 00:20:42.688 "rw_mbytes_per_sec": 0, 00:20:42.688 "r_mbytes_per_sec": 0, 00:20:42.688 "w_mbytes_per_sec": 0 00:20:42.688 }, 00:20:42.688 "claimed": false, 00:20:42.688 "zoned": false, 00:20:42.688 "supported_io_types": { 00:20:42.688 "read": true, 00:20:42.688 "write": true, 00:20:42.688 "unmap": true, 00:20:42.688 "flush": true, 00:20:42.688 "reset": true, 00:20:42.688 "nvme_admin": false, 00:20:42.688 "nvme_io": false, 00:20:42.688 "nvme_io_md": false, 00:20:42.688 "write_zeroes": true, 00:20:42.688 "zcopy": false, 00:20:42.688 "get_zone_info": false, 00:20:42.688 "zone_management": false, 00:20:42.688 "zone_append": false, 00:20:42.688 "compare": false, 00:20:42.688 "compare_and_write": false, 00:20:42.688 "abort": false, 00:20:42.688 "seek_hole": 
false, 00:20:42.688 "seek_data": false, 00:20:42.688 "copy": false, 00:20:42.688 "nvme_iov_md": false 00:20:42.688 }, 00:20:42.688 "memory_domains": [ 00:20:42.688 { 00:20:42.688 "dma_device_id": "system", 00:20:42.688 "dma_device_type": 1 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.688 "dma_device_type": 2 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "dma_device_id": "system", 00:20:42.688 "dma_device_type": 1 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.688 "dma_device_type": 2 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "dma_device_id": "system", 00:20:42.688 "dma_device_type": 1 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.688 "dma_device_type": 2 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "dma_device_id": "system", 00:20:42.688 "dma_device_type": 1 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.688 "dma_device_type": 2 00:20:42.688 } 00:20:42.688 ], 00:20:42.688 "driver_specific": { 00:20:42.688 "raid": { 00:20:42.688 "uuid": "81ae4c11-eb1c-44ec-92e4-8ac61786eb20", 00:20:42.688 "strip_size_kb": 64, 00:20:42.688 "state": "online", 00:20:42.688 "raid_level": "concat", 00:20:42.688 "superblock": true, 00:20:42.688 "num_base_bdevs": 4, 00:20:42.688 "num_base_bdevs_discovered": 4, 00:20:42.688 "num_base_bdevs_operational": 4, 00:20:42.688 "base_bdevs_list": [ 00:20:42.688 { 00:20:42.688 "name": "BaseBdev1", 00:20:42.688 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:42.688 "is_configured": true, 00:20:42.688 "data_offset": 2048, 00:20:42.688 "data_size": 63488 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "name": "BaseBdev2", 00:20:42.688 "uuid": "22164232-47bf-4620-a505-fa8fae6f94c9", 00:20:42.688 "is_configured": true, 00:20:42.688 "data_offset": 2048, 00:20:42.688 "data_size": 63488 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "name": "BaseBdev3", 00:20:42.688 
"uuid": "4806fdeb-ea12-4ed0-b876-729bbcadf549", 00:20:42.688 "is_configured": true, 00:20:42.688 "data_offset": 2048, 00:20:42.688 "data_size": 63488 00:20:42.688 }, 00:20:42.688 { 00:20:42.688 "name": "BaseBdev4", 00:20:42.688 "uuid": "6c1f7fb4-e170-4eeb-b5b7-d064568af277", 00:20:42.688 "is_configured": true, 00:20:42.688 "data_offset": 2048, 00:20:42.688 "data_size": 63488 00:20:42.688 } 00:20:42.688 ] 00:20:42.688 } 00:20:42.688 } 00:20:42.688 }' 00:20:42.688 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:42.945 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:42.945 BaseBdev2 00:20:42.945 BaseBdev3 00:20:42.945 BaseBdev4' 00:20:42.945 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:42.945 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:42.945 10:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:43.202 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:43.202 "name": "BaseBdev1", 00:20:43.202 "aliases": [ 00:20:43.202 "b07d44a1-595c-412d-acf0-101f8f82d2fb" 00:20:43.202 ], 00:20:43.202 "product_name": "Malloc disk", 00:20:43.202 "block_size": 512, 00:20:43.202 "num_blocks": 65536, 00:20:43.202 "uuid": "b07d44a1-595c-412d-acf0-101f8f82d2fb", 00:20:43.202 "assigned_rate_limits": { 00:20:43.202 "rw_ios_per_sec": 0, 00:20:43.202 "rw_mbytes_per_sec": 0, 00:20:43.202 "r_mbytes_per_sec": 0, 00:20:43.202 "w_mbytes_per_sec": 0 00:20:43.202 }, 00:20:43.202 "claimed": true, 00:20:43.202 "claim_type": "exclusive_write", 00:20:43.202 "zoned": false, 00:20:43.202 "supported_io_types": { 
00:20:43.202 "read": true, 00:20:43.202 "write": true, 00:20:43.202 "unmap": true, 00:20:43.202 "flush": true, 00:20:43.202 "reset": true, 00:20:43.202 "nvme_admin": false, 00:20:43.202 "nvme_io": false, 00:20:43.202 "nvme_io_md": false, 00:20:43.202 "write_zeroes": true, 00:20:43.202 "zcopy": true, 00:20:43.202 "get_zone_info": false, 00:20:43.202 "zone_management": false, 00:20:43.202 "zone_append": false, 00:20:43.202 "compare": false, 00:20:43.202 "compare_and_write": false, 00:20:43.202 "abort": true, 00:20:43.202 "seek_hole": false, 00:20:43.202 "seek_data": false, 00:20:43.202 "copy": true, 00:20:43.202 "nvme_iov_md": false 00:20:43.202 }, 00:20:43.202 "memory_domains": [ 00:20:43.202 { 00:20:43.202 "dma_device_id": "system", 00:20:43.202 "dma_device_type": 1 00:20:43.202 }, 00:20:43.202 { 00:20:43.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.202 "dma_device_type": 2 00:20:43.202 } 00:20:43.202 ], 00:20:43.202 "driver_specific": {} 00:20:43.202 }' 00:20:43.203 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.203 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.203 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:43.203 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.203 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.203 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:43.203 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:43.460 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:43.718 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:43.718 "name": "BaseBdev2", 00:20:43.718 "aliases": [ 00:20:43.718 "22164232-47bf-4620-a505-fa8fae6f94c9" 00:20:43.718 ], 00:20:43.718 "product_name": "Malloc disk", 00:20:43.718 "block_size": 512, 00:20:43.718 "num_blocks": 65536, 00:20:43.718 "uuid": "22164232-47bf-4620-a505-fa8fae6f94c9", 00:20:43.718 "assigned_rate_limits": { 00:20:43.718 "rw_ios_per_sec": 0, 00:20:43.718 "rw_mbytes_per_sec": 0, 00:20:43.718 "r_mbytes_per_sec": 0, 00:20:43.718 "w_mbytes_per_sec": 0 00:20:43.718 }, 00:20:43.718 "claimed": true, 00:20:43.718 "claim_type": "exclusive_write", 00:20:43.718 "zoned": false, 00:20:43.718 "supported_io_types": { 00:20:43.718 "read": true, 00:20:43.718 "write": true, 00:20:43.718 "unmap": true, 00:20:43.718 "flush": true, 00:20:43.718 "reset": true, 00:20:43.718 "nvme_admin": false, 00:20:43.718 "nvme_io": false, 00:20:43.718 "nvme_io_md": false, 00:20:43.718 "write_zeroes": true, 00:20:43.718 "zcopy": true, 00:20:43.718 "get_zone_info": false, 00:20:43.718 "zone_management": false, 00:20:43.718 "zone_append": false, 00:20:43.718 "compare": false, 00:20:43.718 "compare_and_write": false, 00:20:43.718 "abort": true, 00:20:43.718 "seek_hole": false, 00:20:43.718 "seek_data": 
false, 00:20:43.718 "copy": true, 00:20:43.718 "nvme_iov_md": false 00:20:43.718 }, 00:20:43.718 "memory_domains": [ 00:20:43.718 { 00:20:43.718 "dma_device_id": "system", 00:20:43.718 "dma_device_type": 1 00:20:43.718 }, 00:20:43.718 { 00:20:43.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.718 "dma_device_type": 2 00:20:43.718 } 00:20:43.718 ], 00:20:43.718 "driver_specific": {} 00:20:43.718 }' 00:20:43.718 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.718 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.718 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:43.718 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.975 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.975 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:43.975 10:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:20:43.975 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:44.233 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:44.233 "name": "BaseBdev3", 00:20:44.233 "aliases": [ 00:20:44.233 "4806fdeb-ea12-4ed0-b876-729bbcadf549" 00:20:44.233 ], 00:20:44.233 "product_name": "Malloc disk", 00:20:44.233 "block_size": 512, 00:20:44.233 "num_blocks": 65536, 00:20:44.233 "uuid": "4806fdeb-ea12-4ed0-b876-729bbcadf549", 00:20:44.233 "assigned_rate_limits": { 00:20:44.233 "rw_ios_per_sec": 0, 00:20:44.233 "rw_mbytes_per_sec": 0, 00:20:44.233 "r_mbytes_per_sec": 0, 00:20:44.233 "w_mbytes_per_sec": 0 00:20:44.233 }, 00:20:44.233 "claimed": true, 00:20:44.233 "claim_type": "exclusive_write", 00:20:44.233 "zoned": false, 00:20:44.233 "supported_io_types": { 00:20:44.233 "read": true, 00:20:44.233 "write": true, 00:20:44.233 "unmap": true, 00:20:44.233 "flush": true, 00:20:44.233 "reset": true, 00:20:44.233 "nvme_admin": false, 00:20:44.233 "nvme_io": false, 00:20:44.233 "nvme_io_md": false, 00:20:44.233 "write_zeroes": true, 00:20:44.233 "zcopy": true, 00:20:44.233 "get_zone_info": false, 00:20:44.233 "zone_management": false, 00:20:44.233 "zone_append": false, 00:20:44.233 "compare": false, 00:20:44.233 "compare_and_write": false, 00:20:44.233 "abort": true, 00:20:44.233 "seek_hole": false, 00:20:44.233 "seek_data": false, 00:20:44.233 "copy": true, 00:20:44.233 "nvme_iov_md": false 00:20:44.233 }, 00:20:44.233 "memory_domains": [ 00:20:44.233 { 00:20:44.233 "dma_device_id": "system", 00:20:44.233 "dma_device_type": 1 00:20:44.233 }, 00:20:44.233 { 00:20:44.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.233 "dma_device_type": 2 00:20:44.233 } 00:20:44.233 ], 00:20:44.233 "driver_specific": {} 00:20:44.233 }' 00:20:44.233 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.233 10:28:21 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.233 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:44.233 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.233 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:44.491 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:44.749 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:44.749 "name": "BaseBdev4", 00:20:44.749 "aliases": [ 00:20:44.749 "6c1f7fb4-e170-4eeb-b5b7-d064568af277" 00:20:44.749 ], 00:20:44.749 "product_name": "Malloc disk", 00:20:44.749 "block_size": 512, 00:20:44.749 "num_blocks": 65536, 00:20:44.749 "uuid": "6c1f7fb4-e170-4eeb-b5b7-d064568af277", 00:20:44.749 "assigned_rate_limits": { 00:20:44.749 
"rw_ios_per_sec": 0, 00:20:44.749 "rw_mbytes_per_sec": 0, 00:20:44.749 "r_mbytes_per_sec": 0, 00:20:44.749 "w_mbytes_per_sec": 0 00:20:44.749 }, 00:20:44.749 "claimed": true, 00:20:44.749 "claim_type": "exclusive_write", 00:20:44.749 "zoned": false, 00:20:44.749 "supported_io_types": { 00:20:44.749 "read": true, 00:20:44.749 "write": true, 00:20:44.749 "unmap": true, 00:20:44.749 "flush": true, 00:20:44.749 "reset": true, 00:20:44.749 "nvme_admin": false, 00:20:44.749 "nvme_io": false, 00:20:44.749 "nvme_io_md": false, 00:20:44.749 "write_zeroes": true, 00:20:44.749 "zcopy": true, 00:20:44.749 "get_zone_info": false, 00:20:44.749 "zone_management": false, 00:20:44.749 "zone_append": false, 00:20:44.749 "compare": false, 00:20:44.749 "compare_and_write": false, 00:20:44.749 "abort": true, 00:20:44.749 "seek_hole": false, 00:20:44.749 "seek_data": false, 00:20:44.749 "copy": true, 00:20:44.749 "nvme_iov_md": false 00:20:44.749 }, 00:20:44.749 "memory_domains": [ 00:20:44.749 { 00:20:44.749 "dma_device_id": "system", 00:20:44.749 "dma_device_type": 1 00:20:44.749 }, 00:20:44.749 { 00:20:44.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.749 "dma_device_type": 2 00:20:44.749 } 00:20:44.749 ], 00:20:44.749 "driver_specific": {} 00:20:44.749 }' 00:20:44.749 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.749 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:45.007 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:45.007 10:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.007 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:45.007 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:45.007 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:20:45.007 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:45.007 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:45.007 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:45.007 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:45.264 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:45.264 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:45.264 [2024-07-15 10:28:22.451648] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:45.264 [2024-07-15 10:28:22.451679] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:45.264 [2024-07-15 10:28:22.451730] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.522 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:45.781 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:45.781 "name": "Existed_Raid", 00:20:45.781 "uuid": "81ae4c11-eb1c-44ec-92e4-8ac61786eb20", 00:20:45.781 "strip_size_kb": 64, 00:20:45.781 "state": "offline", 00:20:45.781 "raid_level": "concat", 00:20:45.781 "superblock": true, 00:20:45.781 "num_base_bdevs": 4, 00:20:45.781 "num_base_bdevs_discovered": 3, 00:20:45.781 "num_base_bdevs_operational": 3, 00:20:45.781 "base_bdevs_list": [ 00:20:45.781 { 00:20:45.781 "name": null, 00:20:45.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.781 "is_configured": false, 00:20:45.781 "data_offset": 2048, 00:20:45.781 "data_size": 63488 00:20:45.781 }, 00:20:45.781 { 00:20:45.781 "name": "BaseBdev2", 00:20:45.781 "uuid": 
"22164232-47bf-4620-a505-fa8fae6f94c9", 00:20:45.781 "is_configured": true, 00:20:45.781 "data_offset": 2048, 00:20:45.781 "data_size": 63488 00:20:45.781 }, 00:20:45.781 { 00:20:45.781 "name": "BaseBdev3", 00:20:45.781 "uuid": "4806fdeb-ea12-4ed0-b876-729bbcadf549", 00:20:45.781 "is_configured": true, 00:20:45.781 "data_offset": 2048, 00:20:45.781 "data_size": 63488 00:20:45.781 }, 00:20:45.781 { 00:20:45.781 "name": "BaseBdev4", 00:20:45.781 "uuid": "6c1f7fb4-e170-4eeb-b5b7-d064568af277", 00:20:45.781 "is_configured": true, 00:20:45.781 "data_offset": 2048, 00:20:45.781 "data_size": 63488 00:20:45.781 } 00:20:45.781 ] 00:20:45.781 }' 00:20:45.781 10:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:45.781 10:28:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:46.346 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:46.346 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:46.346 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.346 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:46.604 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:46.604 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:46.604 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:46.861 [2024-07-15 10:28:23.809186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:46.861 10:28:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:46.861 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:46.861 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.861 10:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:47.119 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:47.119 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:47.119 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:47.119 [2024-07-15 10:28:24.310965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:47.377 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:47.377 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:47.377 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.377 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:47.634 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:47.634 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:47.634 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:47.634 [2024-07-15 10:28:24.816763] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:47.634 [2024-07-15 10:28:24.816814] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbc350 name Existed_Raid, state offline 00:20:47.892 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:47.892 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:47.892 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.892 10:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:48.150 BaseBdev2 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:48.150 10:28:25 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:48.150 10:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:48.408 10:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:48.666 [ 00:20:48.666 { 00:20:48.666 "name": "BaseBdev2", 00:20:48.666 "aliases": [ 00:20:48.666 "c6d575b3-c9eb-4e52-bc44-669a33be3513" 00:20:48.666 ], 00:20:48.666 "product_name": "Malloc disk", 00:20:48.666 "block_size": 512, 00:20:48.666 "num_blocks": 65536, 00:20:48.666 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:48.666 "assigned_rate_limits": { 00:20:48.666 "rw_ios_per_sec": 0, 00:20:48.666 "rw_mbytes_per_sec": 0, 00:20:48.666 "r_mbytes_per_sec": 0, 00:20:48.666 "w_mbytes_per_sec": 0 00:20:48.666 }, 00:20:48.666 "claimed": false, 00:20:48.666 "zoned": false, 00:20:48.666 "supported_io_types": { 00:20:48.666 "read": true, 00:20:48.666 "write": true, 00:20:48.666 "unmap": true, 00:20:48.666 "flush": true, 00:20:48.666 "reset": true, 00:20:48.666 "nvme_admin": false, 00:20:48.666 "nvme_io": false, 00:20:48.666 "nvme_io_md": false, 00:20:48.666 "write_zeroes": true, 00:20:48.666 "zcopy": true, 00:20:48.666 "get_zone_info": false, 00:20:48.666 "zone_management": false, 00:20:48.666 "zone_append": false, 00:20:48.666 "compare": false, 00:20:48.666 "compare_and_write": false, 00:20:48.666 "abort": true, 00:20:48.666 "seek_hole": false, 00:20:48.666 "seek_data": false, 00:20:48.666 "copy": true, 00:20:48.666 "nvme_iov_md": 
false 00:20:48.666 }, 00:20:48.666 "memory_domains": [ 00:20:48.666 { 00:20:48.666 "dma_device_id": "system", 00:20:48.666 "dma_device_type": 1 00:20:48.666 }, 00:20:48.666 { 00:20:48.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:48.666 "dma_device_type": 2 00:20:48.666 } 00:20:48.666 ], 00:20:48.666 "driver_specific": {} 00:20:48.666 } 00:20:48.666 ] 00:20:48.666 10:28:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:48.666 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:48.666 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:48.666 10:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:48.924 BaseBdev3 00:20:48.924 10:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:48.924 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:48.924 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:48.924 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:48.924 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:48.924 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:48.924 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:49.182 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:49.440 [ 00:20:49.440 { 00:20:49.440 "name": "BaseBdev3", 00:20:49.440 "aliases": [ 00:20:49.440 "e1fc817d-1d8e-4608-9922-bb1a4d3ac699" 00:20:49.440 ], 00:20:49.440 "product_name": "Malloc disk", 00:20:49.440 "block_size": 512, 00:20:49.440 "num_blocks": 65536, 00:20:49.440 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:49.440 "assigned_rate_limits": { 00:20:49.440 "rw_ios_per_sec": 0, 00:20:49.440 "rw_mbytes_per_sec": 0, 00:20:49.440 "r_mbytes_per_sec": 0, 00:20:49.440 "w_mbytes_per_sec": 0 00:20:49.440 }, 00:20:49.440 "claimed": false, 00:20:49.440 "zoned": false, 00:20:49.440 "supported_io_types": { 00:20:49.440 "read": true, 00:20:49.440 "write": true, 00:20:49.440 "unmap": true, 00:20:49.440 "flush": true, 00:20:49.440 "reset": true, 00:20:49.440 "nvme_admin": false, 00:20:49.440 "nvme_io": false, 00:20:49.440 "nvme_io_md": false, 00:20:49.440 "write_zeroes": true, 00:20:49.440 "zcopy": true, 00:20:49.440 "get_zone_info": false, 00:20:49.440 "zone_management": false, 00:20:49.440 "zone_append": false, 00:20:49.440 "compare": false, 00:20:49.440 "compare_and_write": false, 00:20:49.440 "abort": true, 00:20:49.440 "seek_hole": false, 00:20:49.440 "seek_data": false, 00:20:49.440 "copy": true, 00:20:49.440 "nvme_iov_md": false 00:20:49.440 }, 00:20:49.440 "memory_domains": [ 00:20:49.440 { 00:20:49.440 "dma_device_id": "system", 00:20:49.440 "dma_device_type": 1 00:20:49.440 }, 00:20:49.440 { 00:20:49.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.440 "dma_device_type": 2 00:20:49.440 } 00:20:49.440 ], 00:20:49.440 "driver_specific": {} 00:20:49.440 } 00:20:49.440 ] 00:20:49.440 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:49.440 10:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:49.440 10:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:20:49.440 10:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:49.698 BaseBdev4 00:20:49.698 10:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:49.698 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:49.698 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:49.698 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:49.698 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:49.698 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:49.698 10:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:49.987 10:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:50.245 [ 00:20:50.245 { 00:20:50.245 "name": "BaseBdev4", 00:20:50.245 "aliases": [ 00:20:50.245 "5a8697d1-0257-4b73-bd18-fdacc90d04d6" 00:20:50.245 ], 00:20:50.245 "product_name": "Malloc disk", 00:20:50.245 "block_size": 512, 00:20:50.245 "num_blocks": 65536, 00:20:50.245 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:50.245 "assigned_rate_limits": { 00:20:50.245 "rw_ios_per_sec": 0, 00:20:50.245 "rw_mbytes_per_sec": 0, 00:20:50.245 "r_mbytes_per_sec": 0, 00:20:50.245 "w_mbytes_per_sec": 0 00:20:50.245 }, 00:20:50.245 "claimed": false, 00:20:50.245 "zoned": false, 00:20:50.245 "supported_io_types": { 00:20:50.245 
"read": true, 00:20:50.245 "write": true, 00:20:50.245 "unmap": true, 00:20:50.245 "flush": true, 00:20:50.245 "reset": true, 00:20:50.245 "nvme_admin": false, 00:20:50.245 "nvme_io": false, 00:20:50.245 "nvme_io_md": false, 00:20:50.245 "write_zeroes": true, 00:20:50.245 "zcopy": true, 00:20:50.245 "get_zone_info": false, 00:20:50.245 "zone_management": false, 00:20:50.245 "zone_append": false, 00:20:50.245 "compare": false, 00:20:50.245 "compare_and_write": false, 00:20:50.245 "abort": true, 00:20:50.245 "seek_hole": false, 00:20:50.245 "seek_data": false, 00:20:50.245 "copy": true, 00:20:50.245 "nvme_iov_md": false 00:20:50.245 }, 00:20:50.245 "memory_domains": [ 00:20:50.245 { 00:20:50.245 "dma_device_id": "system", 00:20:50.245 "dma_device_type": 1 00:20:50.245 }, 00:20:50.245 { 00:20:50.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:50.245 "dma_device_type": 2 00:20:50.245 } 00:20:50.245 ], 00:20:50.245 "driver_specific": {} 00:20:50.245 } 00:20:50.245 ] 00:20:50.245 10:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:50.245 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:50.245 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:50.245 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:50.503 [2024-07-15 10:28:27.529242] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:50.503 [2024-07-15 10:28:27.529287] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:50.503 [2024-07-15 10:28:27.529308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:50.503 [2024-07-15 
10:28:27.530675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:50.503 [2024-07-15 10:28:27.530718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.503 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.504 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.504 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.504 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.504 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:50.762 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.762 "name": "Existed_Raid", 00:20:50.762 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:20:50.762 "strip_size_kb": 64, 
00:20:50.762 "state": "configuring", 00:20:50.762 "raid_level": "concat", 00:20:50.762 "superblock": true, 00:20:50.762 "num_base_bdevs": 4, 00:20:50.762 "num_base_bdevs_discovered": 3, 00:20:50.762 "num_base_bdevs_operational": 4, 00:20:50.762 "base_bdevs_list": [ 00:20:50.762 { 00:20:50.762 "name": "BaseBdev1", 00:20:50.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.762 "is_configured": false, 00:20:50.762 "data_offset": 0, 00:20:50.762 "data_size": 0 00:20:50.762 }, 00:20:50.762 { 00:20:50.762 "name": "BaseBdev2", 00:20:50.762 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:50.762 "is_configured": true, 00:20:50.762 "data_offset": 2048, 00:20:50.762 "data_size": 63488 00:20:50.762 }, 00:20:50.762 { 00:20:50.762 "name": "BaseBdev3", 00:20:50.762 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:50.762 "is_configured": true, 00:20:50.762 "data_offset": 2048, 00:20:50.762 "data_size": 63488 00:20:50.762 }, 00:20:50.762 { 00:20:50.762 "name": "BaseBdev4", 00:20:50.762 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:50.762 "is_configured": true, 00:20:50.762 "data_offset": 2048, 00:20:50.762 "data_size": 63488 00:20:50.762 } 00:20:50.762 ] 00:20:50.762 }' 00:20:50.762 10:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.762 10:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:51.328 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:51.586 [2024-07-15 10:28:28.608078] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.586 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:51.844 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:51.844 "name": "Existed_Raid", 00:20:51.844 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:20:51.844 "strip_size_kb": 64, 00:20:51.844 "state": "configuring", 00:20:51.844 "raid_level": "concat", 00:20:51.844 "superblock": true, 00:20:51.844 "num_base_bdevs": 4, 00:20:51.844 "num_base_bdevs_discovered": 2, 00:20:51.844 "num_base_bdevs_operational": 4, 00:20:51.844 "base_bdevs_list": [ 00:20:51.844 { 00:20:51.844 "name": "BaseBdev1", 00:20:51.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.844 "is_configured": false, 00:20:51.844 "data_offset": 0, 00:20:51.844 "data_size": 0 
00:20:51.844 }, 00:20:51.844 { 00:20:51.844 "name": null, 00:20:51.844 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:51.844 "is_configured": false, 00:20:51.844 "data_offset": 2048, 00:20:51.844 "data_size": 63488 00:20:51.844 }, 00:20:51.844 { 00:20:51.844 "name": "BaseBdev3", 00:20:51.844 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:51.844 "is_configured": true, 00:20:51.844 "data_offset": 2048, 00:20:51.844 "data_size": 63488 00:20:51.844 }, 00:20:51.844 { 00:20:51.844 "name": "BaseBdev4", 00:20:51.844 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:51.844 "is_configured": true, 00:20:51.844 "data_offset": 2048, 00:20:51.844 "data_size": 63488 00:20:51.844 } 00:20:51.844 ] 00:20:51.844 }' 00:20:51.844 10:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:51.844 10:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:52.410 10:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.410 10:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:52.668 10:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:52.669 10:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:52.927 [2024-07-15 10:28:29.988336] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:52.927 BaseBdev1 00:20:52.927 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:52.927 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
00:20:52.927 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:52.927 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:52.927 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:52.927 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:52.927 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:53.185 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:53.443 [ 00:20:53.443 { 00:20:53.443 "name": "BaseBdev1", 00:20:53.443 "aliases": [ 00:20:53.443 "53a3812b-836d-488c-a9b4-6c283de1cb05" 00:20:53.443 ], 00:20:53.443 "product_name": "Malloc disk", 00:20:53.443 "block_size": 512, 00:20:53.443 "num_blocks": 65536, 00:20:53.443 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:20:53.443 "assigned_rate_limits": { 00:20:53.443 "rw_ios_per_sec": 0, 00:20:53.443 "rw_mbytes_per_sec": 0, 00:20:53.443 "r_mbytes_per_sec": 0, 00:20:53.443 "w_mbytes_per_sec": 0 00:20:53.443 }, 00:20:53.443 "claimed": true, 00:20:53.443 "claim_type": "exclusive_write", 00:20:53.443 "zoned": false, 00:20:53.443 "supported_io_types": { 00:20:53.443 "read": true, 00:20:53.443 "write": true, 00:20:53.443 "unmap": true, 00:20:53.443 "flush": true, 00:20:53.443 "reset": true, 00:20:53.443 "nvme_admin": false, 00:20:53.443 "nvme_io": false, 00:20:53.443 "nvme_io_md": false, 00:20:53.443 "write_zeroes": true, 00:20:53.443 "zcopy": true, 00:20:53.443 "get_zone_info": false, 00:20:53.443 "zone_management": false, 00:20:53.443 "zone_append": false, 00:20:53.443 "compare": false, 
00:20:53.443 "compare_and_write": false, 00:20:53.443 "abort": true, 00:20:53.443 "seek_hole": false, 00:20:53.443 "seek_data": false, 00:20:53.443 "copy": true, 00:20:53.443 "nvme_iov_md": false 00:20:53.443 }, 00:20:53.443 "memory_domains": [ 00:20:53.443 { 00:20:53.443 "dma_device_id": "system", 00:20:53.443 "dma_device_type": 1 00:20:53.443 }, 00:20:53.443 { 00:20:53.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.443 "dma_device_type": 2 00:20:53.443 } 00:20:53.443 ], 00:20:53.443 "driver_specific": {} 00:20:53.443 } 00:20:53.443 ] 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.443 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.701 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.701 "name": "Existed_Raid", 00:20:53.701 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:20:53.701 "strip_size_kb": 64, 00:20:53.701 "state": "configuring", 00:20:53.701 "raid_level": "concat", 00:20:53.701 "superblock": true, 00:20:53.701 "num_base_bdevs": 4, 00:20:53.701 "num_base_bdevs_discovered": 3, 00:20:53.701 "num_base_bdevs_operational": 4, 00:20:53.701 "base_bdevs_list": [ 00:20:53.701 { 00:20:53.701 "name": "BaseBdev1", 00:20:53.701 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:20:53.701 "is_configured": true, 00:20:53.701 "data_offset": 2048, 00:20:53.701 "data_size": 63488 00:20:53.701 }, 00:20:53.701 { 00:20:53.701 "name": null, 00:20:53.701 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:53.701 "is_configured": false, 00:20:53.701 "data_offset": 2048, 00:20:53.701 "data_size": 63488 00:20:53.701 }, 00:20:53.701 { 00:20:53.701 "name": "BaseBdev3", 00:20:53.701 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:53.701 "is_configured": true, 00:20:53.701 "data_offset": 2048, 00:20:53.701 "data_size": 63488 00:20:53.701 }, 00:20:53.701 { 00:20:53.701 "name": "BaseBdev4", 00:20:53.701 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:53.701 "is_configured": true, 00:20:53.701 "data_offset": 2048, 00:20:53.701 "data_size": 63488 00:20:53.701 } 00:20:53.701 ] 00:20:53.701 }' 00:20:53.701 10:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.701 10:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:54.266 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.266 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:54.524 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:54.524 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:54.781 [2024-07-15 10:28:31.837273] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:54.781 10:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.037 10:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.037 "name": "Existed_Raid", 00:20:55.037 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:20:55.037 "strip_size_kb": 64, 00:20:55.037 "state": "configuring", 00:20:55.037 "raid_level": "concat", 00:20:55.037 "superblock": true, 00:20:55.037 "num_base_bdevs": 4, 00:20:55.037 "num_base_bdevs_discovered": 2, 00:20:55.037 "num_base_bdevs_operational": 4, 00:20:55.037 "base_bdevs_list": [ 00:20:55.037 { 00:20:55.037 "name": "BaseBdev1", 00:20:55.037 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:20:55.037 "is_configured": true, 00:20:55.037 "data_offset": 2048, 00:20:55.037 "data_size": 63488 00:20:55.037 }, 00:20:55.037 { 00:20:55.037 "name": null, 00:20:55.037 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:55.037 "is_configured": false, 00:20:55.037 "data_offset": 2048, 00:20:55.037 "data_size": 63488 00:20:55.037 }, 00:20:55.037 { 00:20:55.037 "name": null, 00:20:55.037 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:55.037 "is_configured": false, 00:20:55.037 "data_offset": 2048, 00:20:55.037 "data_size": 63488 00:20:55.037 }, 00:20:55.037 { 00:20:55.037 "name": "BaseBdev4", 00:20:55.037 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:55.037 "is_configured": true, 00:20:55.037 "data_offset": 2048, 00:20:55.037 "data_size": 63488 00:20:55.037 } 00:20:55.037 ] 00:20:55.037 }' 00:20:55.037 10:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.037 10:28:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:55.599 10:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:55.599 10:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:55.856 10:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:55.856 10:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:56.114 [2024-07-15 10:28:33.136732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:56.114 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:56.372 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.372 "name": "Existed_Raid", 00:20:56.372 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:20:56.372 "strip_size_kb": 64, 00:20:56.372 "state": "configuring", 00:20:56.372 "raid_level": "concat", 00:20:56.372 "superblock": true, 00:20:56.372 "num_base_bdevs": 4, 00:20:56.372 "num_base_bdevs_discovered": 3, 00:20:56.372 "num_base_bdevs_operational": 4, 00:20:56.372 "base_bdevs_list": [ 00:20:56.372 { 00:20:56.372 "name": "BaseBdev1", 00:20:56.372 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:20:56.372 "is_configured": true, 00:20:56.372 "data_offset": 2048, 00:20:56.372 "data_size": 63488 00:20:56.372 }, 00:20:56.372 { 00:20:56.372 "name": null, 00:20:56.372 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:56.372 "is_configured": false, 00:20:56.372 "data_offset": 2048, 00:20:56.372 "data_size": 63488 00:20:56.372 }, 00:20:56.372 { 00:20:56.372 "name": "BaseBdev3", 00:20:56.372 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:56.372 "is_configured": true, 00:20:56.372 "data_offset": 2048, 00:20:56.372 "data_size": 63488 00:20:56.372 }, 00:20:56.372 { 00:20:56.372 "name": "BaseBdev4", 00:20:56.372 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:56.372 "is_configured": true, 00:20:56.372 "data_offset": 2048, 00:20:56.372 "data_size": 63488 00:20:56.372 } 00:20:56.372 ] 00:20:56.372 }' 00:20:56.372 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.372 10:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:56.937 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:56.937 10:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:57.194 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:57.194 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:57.453 [2024-07-15 10:28:34.408109] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.453 
10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.453 "name": "Existed_Raid", 00:20:57.453 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:20:57.453 "strip_size_kb": 64, 00:20:57.453 "state": "configuring", 00:20:57.453 "raid_level": "concat", 00:20:57.453 "superblock": true, 00:20:57.453 "num_base_bdevs": 4, 00:20:57.453 "num_base_bdevs_discovered": 2, 00:20:57.453 "num_base_bdevs_operational": 4, 00:20:57.453 "base_bdevs_list": [ 00:20:57.453 { 00:20:57.453 "name": null, 00:20:57.453 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:20:57.453 "is_configured": false, 00:20:57.453 "data_offset": 2048, 00:20:57.453 "data_size": 63488 00:20:57.453 }, 00:20:57.453 { 00:20:57.453 "name": null, 00:20:57.453 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:57.453 "is_configured": false, 00:20:57.453 "data_offset": 2048, 00:20:57.453 "data_size": 63488 00:20:57.453 }, 00:20:57.453 { 00:20:57.453 "name": "BaseBdev3", 00:20:57.453 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:57.453 "is_configured": true, 00:20:57.453 "data_offset": 2048, 00:20:57.453 "data_size": 63488 00:20:57.453 }, 00:20:57.453 { 00:20:57.453 "name": "BaseBdev4", 00:20:57.453 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:57.453 "is_configured": true, 00:20:57.453 "data_offset": 2048, 00:20:57.453 "data_size": 63488 00:20:57.453 } 00:20:57.453 ] 00:20:57.453 }' 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.453 10:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:58.017 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.017 10:28:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:58.274 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:58.274 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:58.531 [2024-07-15 10:28:35.615712] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.531 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.532 10:28:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.788 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.788 "name": "Existed_Raid", 00:20:58.788 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:20:58.788 "strip_size_kb": 64, 00:20:58.788 "state": "configuring", 00:20:58.788 "raid_level": "concat", 00:20:58.788 "superblock": true, 00:20:58.788 "num_base_bdevs": 4, 00:20:58.788 "num_base_bdevs_discovered": 3, 00:20:58.788 "num_base_bdevs_operational": 4, 00:20:58.788 "base_bdevs_list": [ 00:20:58.788 { 00:20:58.788 "name": null, 00:20:58.788 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:20:58.788 "is_configured": false, 00:20:58.788 "data_offset": 2048, 00:20:58.788 "data_size": 63488 00:20:58.788 }, 00:20:58.788 { 00:20:58.788 "name": "BaseBdev2", 00:20:58.788 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:20:58.788 "is_configured": true, 00:20:58.788 "data_offset": 2048, 00:20:58.788 "data_size": 63488 00:20:58.788 }, 00:20:58.788 { 00:20:58.788 "name": "BaseBdev3", 00:20:58.788 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:20:58.788 "is_configured": true, 00:20:58.788 "data_offset": 2048, 00:20:58.788 "data_size": 63488 00:20:58.788 }, 00:20:58.788 { 00:20:58.788 "name": "BaseBdev4", 00:20:58.788 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:20:58.788 "is_configured": true, 00:20:58.788 "data_offset": 2048, 00:20:58.788 "data_size": 63488 00:20:58.788 } 00:20:58.788 ] 00:20:58.788 }' 00:20:58.788 10:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.788 10:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:59.351 10:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.351 10:28:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:59.608 10:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:59.608 10:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.608 10:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:59.865 10:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 53a3812b-836d-488c-a9b4-6c283de1cb05 00:21:00.122 [2024-07-15 10:28:37.092199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:00.122 [2024-07-15 10:28:37.092368] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bbe850 00:21:00.122 [2024-07-15 10:28:37.092382] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:00.122 [2024-07-15 10:28:37.092562] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb4d80 00:21:00.122 [2024-07-15 10:28:37.092683] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bbe850 00:21:00.122 [2024-07-15 10:28:37.092693] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bbe850 00:21:00.122 [2024-07-15 10:28:37.092787] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:00.122 NewBaseBdev 00:21:00.122 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:00.122 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:00.122 10:28:37 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:00.122 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:00.122 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:00.122 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:00.122 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:00.380 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:00.380 [ 00:21:00.380 { 00:21:00.380 "name": "NewBaseBdev", 00:21:00.380 "aliases": [ 00:21:00.380 "53a3812b-836d-488c-a9b4-6c283de1cb05" 00:21:00.380 ], 00:21:00.380 "product_name": "Malloc disk", 00:21:00.380 "block_size": 512, 00:21:00.380 "num_blocks": 65536, 00:21:00.380 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:21:00.380 "assigned_rate_limits": { 00:21:00.380 "rw_ios_per_sec": 0, 00:21:00.380 "rw_mbytes_per_sec": 0, 00:21:00.380 "r_mbytes_per_sec": 0, 00:21:00.380 "w_mbytes_per_sec": 0 00:21:00.380 }, 00:21:00.380 "claimed": true, 00:21:00.380 "claim_type": "exclusive_write", 00:21:00.380 "zoned": false, 00:21:00.380 "supported_io_types": { 00:21:00.380 "read": true, 00:21:00.380 "write": true, 00:21:00.380 "unmap": true, 00:21:00.380 "flush": true, 00:21:00.380 "reset": true, 00:21:00.380 "nvme_admin": false, 00:21:00.380 "nvme_io": false, 00:21:00.380 "nvme_io_md": false, 00:21:00.380 "write_zeroes": true, 00:21:00.380 "zcopy": true, 00:21:00.380 "get_zone_info": false, 00:21:00.380 "zone_management": false, 00:21:00.380 "zone_append": false, 00:21:00.380 "compare": false, 00:21:00.380 
"compare_and_write": false, 00:21:00.380 "abort": true, 00:21:00.380 "seek_hole": false, 00:21:00.380 "seek_data": false, 00:21:00.380 "copy": true, 00:21:00.380 "nvme_iov_md": false 00:21:00.380 }, 00:21:00.380 "memory_domains": [ 00:21:00.380 { 00:21:00.380 "dma_device_id": "system", 00:21:00.380 "dma_device_type": 1 00:21:00.380 }, 00:21:00.380 { 00:21:00.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.380 "dma_device_type": 2 00:21:00.380 } 00:21:00.380 ], 00:21:00.380 "driver_specific": {} 00:21:00.380 } 00:21:00.380 ] 00:21:00.638 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:00.638 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.639 "name": "Existed_Raid", 00:21:00.639 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:21:00.639 "strip_size_kb": 64, 00:21:00.639 "state": "online", 00:21:00.639 "raid_level": "concat", 00:21:00.639 "superblock": true, 00:21:00.639 "num_base_bdevs": 4, 00:21:00.639 "num_base_bdevs_discovered": 4, 00:21:00.639 "num_base_bdevs_operational": 4, 00:21:00.639 "base_bdevs_list": [ 00:21:00.639 { 00:21:00.639 "name": "NewBaseBdev", 00:21:00.639 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:21:00.639 "is_configured": true, 00:21:00.639 "data_offset": 2048, 00:21:00.639 "data_size": 63488 00:21:00.639 }, 00:21:00.639 { 00:21:00.639 "name": "BaseBdev2", 00:21:00.639 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:21:00.639 "is_configured": true, 00:21:00.639 "data_offset": 2048, 00:21:00.639 "data_size": 63488 00:21:00.639 }, 00:21:00.639 { 00:21:00.639 "name": "BaseBdev3", 00:21:00.639 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:21:00.639 "is_configured": true, 00:21:00.639 "data_offset": 2048, 00:21:00.639 "data_size": 63488 00:21:00.639 }, 00:21:00.639 { 00:21:00.639 "name": "BaseBdev4", 00:21:00.639 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:21:00.639 "is_configured": true, 00:21:00.639 "data_offset": 2048, 00:21:00.639 "data_size": 63488 00:21:00.639 } 00:21:00.639 ] 00:21:00.639 }' 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.639 10:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:01.572 10:28:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:01.572 [2024-07-15 10:28:38.656675] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:01.572 "name": "Existed_Raid", 00:21:01.572 "aliases": [ 00:21:01.572 "354bc040-fd8c-4c90-b9e1-a39c98d5684e" 00:21:01.572 ], 00:21:01.572 "product_name": "Raid Volume", 00:21:01.572 "block_size": 512, 00:21:01.572 "num_blocks": 253952, 00:21:01.572 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:21:01.572 "assigned_rate_limits": { 00:21:01.572 "rw_ios_per_sec": 0, 00:21:01.572 "rw_mbytes_per_sec": 0, 00:21:01.572 "r_mbytes_per_sec": 0, 00:21:01.572 "w_mbytes_per_sec": 0 00:21:01.572 }, 00:21:01.572 "claimed": false, 00:21:01.572 "zoned": false, 00:21:01.572 "supported_io_types": { 00:21:01.572 "read": true, 00:21:01.572 "write": true, 00:21:01.572 "unmap": true, 00:21:01.572 "flush": true, 00:21:01.572 "reset": true, 00:21:01.572 "nvme_admin": false, 00:21:01.572 "nvme_io": false, 00:21:01.572 "nvme_io_md": false, 00:21:01.572 "write_zeroes": true, 00:21:01.572 "zcopy": false, 00:21:01.572 
"get_zone_info": false, 00:21:01.572 "zone_management": false, 00:21:01.572 "zone_append": false, 00:21:01.572 "compare": false, 00:21:01.572 "compare_and_write": false, 00:21:01.572 "abort": false, 00:21:01.572 "seek_hole": false, 00:21:01.572 "seek_data": false, 00:21:01.572 "copy": false, 00:21:01.572 "nvme_iov_md": false 00:21:01.572 }, 00:21:01.572 "memory_domains": [ 00:21:01.572 { 00:21:01.572 "dma_device_id": "system", 00:21:01.572 "dma_device_type": 1 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.572 "dma_device_type": 2 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "dma_device_id": "system", 00:21:01.572 "dma_device_type": 1 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.572 "dma_device_type": 2 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "dma_device_id": "system", 00:21:01.572 "dma_device_type": 1 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.572 "dma_device_type": 2 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "dma_device_id": "system", 00:21:01.572 "dma_device_type": 1 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.572 "dma_device_type": 2 00:21:01.572 } 00:21:01.572 ], 00:21:01.572 "driver_specific": { 00:21:01.572 "raid": { 00:21:01.572 "uuid": "354bc040-fd8c-4c90-b9e1-a39c98d5684e", 00:21:01.572 "strip_size_kb": 64, 00:21:01.572 "state": "online", 00:21:01.572 "raid_level": "concat", 00:21:01.572 "superblock": true, 00:21:01.572 "num_base_bdevs": 4, 00:21:01.572 "num_base_bdevs_discovered": 4, 00:21:01.572 "num_base_bdevs_operational": 4, 00:21:01.572 "base_bdevs_list": [ 00:21:01.572 { 00:21:01.572 "name": "NewBaseBdev", 00:21:01.572 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:21:01.572 "is_configured": true, 00:21:01.572 "data_offset": 2048, 00:21:01.572 "data_size": 63488 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "name": "BaseBdev2", 00:21:01.572 
"uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:21:01.572 "is_configured": true, 00:21:01.572 "data_offset": 2048, 00:21:01.572 "data_size": 63488 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "name": "BaseBdev3", 00:21:01.572 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:21:01.572 "is_configured": true, 00:21:01.572 "data_offset": 2048, 00:21:01.572 "data_size": 63488 00:21:01.572 }, 00:21:01.572 { 00:21:01.572 "name": "BaseBdev4", 00:21:01.572 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:21:01.572 "is_configured": true, 00:21:01.572 "data_offset": 2048, 00:21:01.572 "data_size": 63488 00:21:01.572 } 00:21:01.572 ] 00:21:01.572 } 00:21:01.572 } 00:21:01.572 }' 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:01.572 BaseBdev2 00:21:01.572 BaseBdev3 00:21:01.572 BaseBdev4' 00:21:01.572 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.573 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.573 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:01.830 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.830 "name": "NewBaseBdev", 00:21:01.830 "aliases": [ 00:21:01.830 "53a3812b-836d-488c-a9b4-6c283de1cb05" 00:21:01.830 ], 00:21:01.830 "product_name": "Malloc disk", 00:21:01.830 "block_size": 512, 00:21:01.830 "num_blocks": 65536, 00:21:01.830 "uuid": "53a3812b-836d-488c-a9b4-6c283de1cb05", 00:21:01.830 "assigned_rate_limits": { 00:21:01.830 "rw_ios_per_sec": 0, 00:21:01.830 "rw_mbytes_per_sec": 0, 
00:21:01.830 "r_mbytes_per_sec": 0, 00:21:01.830 "w_mbytes_per_sec": 0 00:21:01.830 }, 00:21:01.830 "claimed": true, 00:21:01.830 "claim_type": "exclusive_write", 00:21:01.830 "zoned": false, 00:21:01.830 "supported_io_types": { 00:21:01.830 "read": true, 00:21:01.830 "write": true, 00:21:01.830 "unmap": true, 00:21:01.830 "flush": true, 00:21:01.830 "reset": true, 00:21:01.830 "nvme_admin": false, 00:21:01.830 "nvme_io": false, 00:21:01.830 "nvme_io_md": false, 00:21:01.830 "write_zeroes": true, 00:21:01.830 "zcopy": true, 00:21:01.830 "get_zone_info": false, 00:21:01.830 "zone_management": false, 00:21:01.830 "zone_append": false, 00:21:01.830 "compare": false, 00:21:01.830 "compare_and_write": false, 00:21:01.830 "abort": true, 00:21:01.830 "seek_hole": false, 00:21:01.830 "seek_data": false, 00:21:01.830 "copy": true, 00:21:01.830 "nvme_iov_md": false 00:21:01.830 }, 00:21:01.830 "memory_domains": [ 00:21:01.830 { 00:21:01.830 "dma_device_id": "system", 00:21:01.830 "dma_device_type": 1 00:21:01.830 }, 00:21:01.830 { 00:21:01.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.830 "dma_device_type": 2 00:21:01.830 } 00:21:01.830 ], 00:21:01.830 "driver_specific": {} 00:21:01.830 }' 00:21:01.830 10:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.830 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.087 10:28:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.087 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.346 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.346 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.346 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:02.346 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.346 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.346 "name": "BaseBdev2", 00:21:02.346 "aliases": [ 00:21:02.346 "c6d575b3-c9eb-4e52-bc44-669a33be3513" 00:21:02.346 ], 00:21:02.346 "product_name": "Malloc disk", 00:21:02.346 "block_size": 512, 00:21:02.346 "num_blocks": 65536, 00:21:02.346 "uuid": "c6d575b3-c9eb-4e52-bc44-669a33be3513", 00:21:02.346 "assigned_rate_limits": { 00:21:02.346 "rw_ios_per_sec": 0, 00:21:02.346 "rw_mbytes_per_sec": 0, 00:21:02.346 "r_mbytes_per_sec": 0, 00:21:02.346 "w_mbytes_per_sec": 0 00:21:02.346 }, 00:21:02.346 "claimed": true, 00:21:02.346 "claim_type": "exclusive_write", 00:21:02.346 "zoned": false, 00:21:02.346 "supported_io_types": { 00:21:02.346 "read": true, 00:21:02.346 "write": true, 00:21:02.346 "unmap": true, 00:21:02.346 "flush": true, 00:21:02.346 "reset": true, 00:21:02.346 "nvme_admin": false, 00:21:02.346 "nvme_io": false, 00:21:02.346 "nvme_io_md": false, 00:21:02.346 "write_zeroes": true, 00:21:02.346 "zcopy": true, 00:21:02.346 
"get_zone_info": false, 00:21:02.346 "zone_management": false, 00:21:02.346 "zone_append": false, 00:21:02.346 "compare": false, 00:21:02.346 "compare_and_write": false, 00:21:02.346 "abort": true, 00:21:02.346 "seek_hole": false, 00:21:02.346 "seek_data": false, 00:21:02.346 "copy": true, 00:21:02.346 "nvme_iov_md": false 00:21:02.346 }, 00:21:02.346 "memory_domains": [ 00:21:02.346 { 00:21:02.346 "dma_device_id": "system", 00:21:02.346 "dma_device_type": 1 00:21:02.346 }, 00:21:02.346 { 00:21:02.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.346 "dma_device_type": 2 00:21:02.346 } 00:21:02.346 ], 00:21:02.346 "driver_specific": {} 00:21:02.346 }' 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.605 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.863 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.863 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.863 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.863 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.863 10:28:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:02.863 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:02.863 10:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.121 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.121 "name": "BaseBdev3", 00:21:03.121 "aliases": [ 00:21:03.121 "e1fc817d-1d8e-4608-9922-bb1a4d3ac699" 00:21:03.121 ], 00:21:03.121 "product_name": "Malloc disk", 00:21:03.121 "block_size": 512, 00:21:03.121 "num_blocks": 65536, 00:21:03.121 "uuid": "e1fc817d-1d8e-4608-9922-bb1a4d3ac699", 00:21:03.121 "assigned_rate_limits": { 00:21:03.121 "rw_ios_per_sec": 0, 00:21:03.121 "rw_mbytes_per_sec": 0, 00:21:03.121 "r_mbytes_per_sec": 0, 00:21:03.121 "w_mbytes_per_sec": 0 00:21:03.121 }, 00:21:03.121 "claimed": true, 00:21:03.121 "claim_type": "exclusive_write", 00:21:03.121 "zoned": false, 00:21:03.121 "supported_io_types": { 00:21:03.121 "read": true, 00:21:03.121 "write": true, 00:21:03.121 "unmap": true, 00:21:03.121 "flush": true, 00:21:03.121 "reset": true, 00:21:03.121 "nvme_admin": false, 00:21:03.121 "nvme_io": false, 00:21:03.121 "nvme_io_md": false, 00:21:03.121 "write_zeroes": true, 00:21:03.121 "zcopy": true, 00:21:03.121 "get_zone_info": false, 00:21:03.121 "zone_management": false, 00:21:03.121 "zone_append": false, 00:21:03.121 "compare": false, 00:21:03.121 "compare_and_write": false, 00:21:03.121 "abort": true, 00:21:03.121 "seek_hole": false, 00:21:03.121 "seek_data": false, 00:21:03.121 "copy": true, 00:21:03.121 "nvme_iov_md": false 00:21:03.121 }, 00:21:03.121 "memory_domains": [ 00:21:03.121 { 00:21:03.121 "dma_device_id": "system", 00:21:03.121 "dma_device_type": 1 00:21:03.121 }, 00:21:03.121 { 00:21:03.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.121 
"dma_device_type": 2 00:21:03.121 } 00:21:03.121 ], 00:21:03.121 "driver_specific": {} 00:21:03.121 }' 00:21:03.121 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.121 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.121 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.121 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.121 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:03.378 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.636 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.636 "name": "BaseBdev4", 00:21:03.636 "aliases": [ 00:21:03.636 
"5a8697d1-0257-4b73-bd18-fdacc90d04d6" 00:21:03.636 ], 00:21:03.636 "product_name": "Malloc disk", 00:21:03.636 "block_size": 512, 00:21:03.636 "num_blocks": 65536, 00:21:03.636 "uuid": "5a8697d1-0257-4b73-bd18-fdacc90d04d6", 00:21:03.636 "assigned_rate_limits": { 00:21:03.636 "rw_ios_per_sec": 0, 00:21:03.636 "rw_mbytes_per_sec": 0, 00:21:03.636 "r_mbytes_per_sec": 0, 00:21:03.636 "w_mbytes_per_sec": 0 00:21:03.636 }, 00:21:03.636 "claimed": true, 00:21:03.636 "claim_type": "exclusive_write", 00:21:03.636 "zoned": false, 00:21:03.636 "supported_io_types": { 00:21:03.636 "read": true, 00:21:03.636 "write": true, 00:21:03.636 "unmap": true, 00:21:03.636 "flush": true, 00:21:03.636 "reset": true, 00:21:03.636 "nvme_admin": false, 00:21:03.636 "nvme_io": false, 00:21:03.636 "nvme_io_md": false, 00:21:03.636 "write_zeroes": true, 00:21:03.636 "zcopy": true, 00:21:03.636 "get_zone_info": false, 00:21:03.636 "zone_management": false, 00:21:03.636 "zone_append": false, 00:21:03.636 "compare": false, 00:21:03.636 "compare_and_write": false, 00:21:03.636 "abort": true, 00:21:03.636 "seek_hole": false, 00:21:03.636 "seek_data": false, 00:21:03.636 "copy": true, 00:21:03.636 "nvme_iov_md": false 00:21:03.636 }, 00:21:03.636 "memory_domains": [ 00:21:03.636 { 00:21:03.636 "dma_device_id": "system", 00:21:03.636 "dma_device_type": 1 00:21:03.636 }, 00:21:03.636 { 00:21:03.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.636 "dma_device_type": 2 00:21:03.636 } 00:21:03.636 ], 00:21:03.636 "driver_specific": {} 00:21:03.636 }' 00:21:03.636 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.636 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.636 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.893 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.893 10:28:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.893 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:03.893 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.893 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.893 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:03.893 10:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.893 10:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.893 10:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.893 10:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:04.150 [2024-07-15 10:28:41.299374] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:04.150 [2024-07-15 10:28:41.299404] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:04.150 [2024-07-15 10:28:41.299460] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:04.150 [2024-07-15 10:28:41.299524] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:04.150 [2024-07-15 10:28:41.299536] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bbe850 name Existed_Raid, state offline 00:21:04.150 10:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 555070 00:21:04.150 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 555070 ']' 00:21:04.150 10:28:41 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 555070 00:21:04.150 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:04.150 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:04.150 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 555070 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 555070' 00:21:04.408 killing process with pid 555070 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 555070 00:21:04.408 [2024-07-15 10:28:41.363395] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 555070 00:21:04.408 [2024-07-15 10:28:41.399969] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:04.408 00:21:04.408 real 0m31.142s 00:21:04.408 user 0m57.083s 00:21:04.408 sys 0m5.722s 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:04.408 10:28:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:04.408 ************************************ 00:21:04.408 END TEST raid_state_function_test_sb 00:21:04.408 ************************************ 00:21:04.665 10:28:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:04.665 10:28:41 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test 
raid_superblock_test concat 4 00:21:04.665 10:28:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:04.665 10:28:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:04.665 10:28:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:04.665 ************************************ 00:21:04.665 START TEST raid_superblock_test 00:21:04.665 ************************************ 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:04.665 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # 
'[' concat '!=' raid1 ']' 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=559780 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 559780 /var/tmp/spdk-raid.sock 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 559780 ']' 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:04.666 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:04.666 10:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:04.666 [2024-07-15 10:28:41.745814] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:21:04.666 [2024-07-15 10:28:41.745880] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid559780 ] 00:21:04.923 [2024-07-15 10:28:41.873059] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:04.923 [2024-07-15 10:28:41.975698] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:04.923 [2024-07-15 10:28:42.037845] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:04.923 [2024-07-15 10:28:42.037885] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:05.489 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:05.747 malloc1 00:21:05.747 10:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:06.004 [2024-07-15 10:28:43.148897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:06.004 [2024-07-15 10:28:43.148951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.004 [2024-07-15 10:28:43.148974] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2689570 00:21:06.004 [2024-07-15 10:28:43.148987] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.004 [2024-07-15 10:28:43.150676] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.004 [2024-07-15 10:28:43.150705] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:06.004 pt1 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:06.004 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:06.004 10:28:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:06.261 malloc2 00:21:06.261 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:06.542 [2024-07-15 10:28:43.647012] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:06.542 [2024-07-15 10:28:43.647059] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:06.542 [2024-07-15 10:28:43.647077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x268a970 00:21:06.542 [2024-07-15 10:28:43.647090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:06.542 [2024-07-15 10:28:43.648720] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:06.542 [2024-07-15 10:28:43.648750] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:06.542 pt2 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:06.542 10:28:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:06.542 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:06.799 malloc3 00:21:06.799 10:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:07.057 [2024-07-15 10:28:44.149504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:07.057 [2024-07-15 10:28:44.149553] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.057 [2024-07-15 10:28:44.149571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2821340 00:21:07.057 [2024-07-15 10:28:44.149584] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.057 [2024-07-15 10:28:44.151185] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.057 [2024-07-15 10:28:44.151216] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:07.057 pt3 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:07.057 
10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:07.057 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:07.315 malloc4 00:21:07.315 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:07.573 [2024-07-15 10:28:44.644670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:07.573 [2024-07-15 10:28:44.644718] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.573 [2024-07-15 10:28:44.644741] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2823c60 00:21:07.573 [2024-07-15 10:28:44.644754] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.573 [2024-07-15 10:28:44.646354] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.573 [2024-07-15 10:28:44.646383] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:07.573 pt4 00:21:07.573 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:07.573 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:07.573 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:07.831 [2024-07-15 10:28:44.889357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:21:07.831 [2024-07-15 10:28:44.890749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:07.831 [2024-07-15 10:28:44.890805] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:07.831 [2024-07-15 10:28:44.890849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:07.831 [2024-07-15 10:28:44.891038] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2681530 00:21:07.831 [2024-07-15 10:28:44.891051] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:07.831 [2024-07-15 10:28:44.891261] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267f770 00:21:07.831 [2024-07-15 10:28:44.891416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2681530 00:21:07.831 [2024-07-15 10:28:44.891427] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2681530 00:21:07.831 [2024-07-15 10:28:44.891531] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.831 10:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:08.090 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.090 "name": "raid_bdev1", 00:21:08.090 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:08.090 "strip_size_kb": 64, 00:21:08.090 "state": "online", 00:21:08.090 "raid_level": "concat", 00:21:08.090 "superblock": true, 00:21:08.090 "num_base_bdevs": 4, 00:21:08.090 "num_base_bdevs_discovered": 4, 00:21:08.090 "num_base_bdevs_operational": 4, 00:21:08.090 "base_bdevs_list": [ 00:21:08.090 { 00:21:08.090 "name": "pt1", 00:21:08.090 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:08.090 "is_configured": true, 00:21:08.090 "data_offset": 2048, 00:21:08.090 "data_size": 63488 00:21:08.090 }, 00:21:08.090 { 00:21:08.090 "name": "pt2", 00:21:08.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:08.090 "is_configured": true, 00:21:08.090 "data_offset": 2048, 00:21:08.090 "data_size": 63488 00:21:08.090 }, 00:21:08.090 { 00:21:08.090 "name": "pt3", 00:21:08.090 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:08.090 "is_configured": true, 00:21:08.090 "data_offset": 2048, 00:21:08.090 "data_size": 63488 00:21:08.090 }, 00:21:08.090 { 00:21:08.090 "name": "pt4", 00:21:08.090 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:08.090 "is_configured": true, 00:21:08.090 "data_offset": 2048, 00:21:08.090 "data_size": 63488 00:21:08.090 } 00:21:08.090 ] 00:21:08.090 }' 00:21:08.090 10:28:45 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.090 10:28:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:08.657 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:08.915 [2024-07-15 10:28:45.860213] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:08.915 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:08.915 "name": "raid_bdev1", 00:21:08.915 "aliases": [ 00:21:08.915 "4d54acd8-82f4-4233-82e1-de41afbd741c" 00:21:08.915 ], 00:21:08.915 "product_name": "Raid Volume", 00:21:08.915 "block_size": 512, 00:21:08.915 "num_blocks": 253952, 00:21:08.915 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:08.915 "assigned_rate_limits": { 00:21:08.915 "rw_ios_per_sec": 0, 00:21:08.915 "rw_mbytes_per_sec": 0, 00:21:08.915 "r_mbytes_per_sec": 0, 00:21:08.916 "w_mbytes_per_sec": 0 00:21:08.916 }, 00:21:08.916 "claimed": false, 00:21:08.916 "zoned": false, 00:21:08.916 "supported_io_types": { 00:21:08.916 "read": true, 00:21:08.916 "write": true, 00:21:08.916 
"unmap": true, 00:21:08.916 "flush": true, 00:21:08.916 "reset": true, 00:21:08.916 "nvme_admin": false, 00:21:08.916 "nvme_io": false, 00:21:08.916 "nvme_io_md": false, 00:21:08.916 "write_zeroes": true, 00:21:08.916 "zcopy": false, 00:21:08.916 "get_zone_info": false, 00:21:08.916 "zone_management": false, 00:21:08.916 "zone_append": false, 00:21:08.916 "compare": false, 00:21:08.916 "compare_and_write": false, 00:21:08.916 "abort": false, 00:21:08.916 "seek_hole": false, 00:21:08.916 "seek_data": false, 00:21:08.916 "copy": false, 00:21:08.916 "nvme_iov_md": false 00:21:08.916 }, 00:21:08.916 "memory_domains": [ 00:21:08.916 { 00:21:08.916 "dma_device_id": "system", 00:21:08.916 "dma_device_type": 1 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.916 "dma_device_type": 2 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "dma_device_id": "system", 00:21:08.916 "dma_device_type": 1 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.916 "dma_device_type": 2 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "dma_device_id": "system", 00:21:08.916 "dma_device_type": 1 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.916 "dma_device_type": 2 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "dma_device_id": "system", 00:21:08.916 "dma_device_type": 1 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:08.916 "dma_device_type": 2 00:21:08.916 } 00:21:08.916 ], 00:21:08.916 "driver_specific": { 00:21:08.916 "raid": { 00:21:08.916 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:08.916 "strip_size_kb": 64, 00:21:08.916 "state": "online", 00:21:08.916 "raid_level": "concat", 00:21:08.916 "superblock": true, 00:21:08.916 "num_base_bdevs": 4, 00:21:08.916 "num_base_bdevs_discovered": 4, 00:21:08.916 "num_base_bdevs_operational": 4, 00:21:08.916 "base_bdevs_list": [ 00:21:08.916 { 00:21:08.916 "name": "pt1", 
00:21:08.916 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:08.916 "is_configured": true, 00:21:08.916 "data_offset": 2048, 00:21:08.916 "data_size": 63488 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "name": "pt2", 00:21:08.916 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:08.916 "is_configured": true, 00:21:08.916 "data_offset": 2048, 00:21:08.916 "data_size": 63488 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "name": "pt3", 00:21:08.916 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:08.916 "is_configured": true, 00:21:08.916 "data_offset": 2048, 00:21:08.916 "data_size": 63488 00:21:08.916 }, 00:21:08.916 { 00:21:08.916 "name": "pt4", 00:21:08.916 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:08.916 "is_configured": true, 00:21:08.916 "data_offset": 2048, 00:21:08.916 "data_size": 63488 00:21:08.916 } 00:21:08.916 ] 00:21:08.916 } 00:21:08.916 } 00:21:08.916 }' 00:21:08.916 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:08.916 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:08.916 pt2 00:21:08.916 pt3 00:21:08.916 pt4' 00:21:08.916 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:08.916 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:08.916 10:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:09.175 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:09.175 "name": "pt1", 00:21:09.175 "aliases": [ 00:21:09.175 "00000000-0000-0000-0000-000000000001" 00:21:09.175 ], 00:21:09.175 "product_name": "passthru", 00:21:09.175 "block_size": 512, 00:21:09.175 "num_blocks": 65536, 00:21:09.175 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:09.175 "assigned_rate_limits": { 00:21:09.175 "rw_ios_per_sec": 0, 00:21:09.175 "rw_mbytes_per_sec": 0, 00:21:09.175 "r_mbytes_per_sec": 0, 00:21:09.175 "w_mbytes_per_sec": 0 00:21:09.175 }, 00:21:09.175 "claimed": true, 00:21:09.175 "claim_type": "exclusive_write", 00:21:09.175 "zoned": false, 00:21:09.175 "supported_io_types": { 00:21:09.175 "read": true, 00:21:09.175 "write": true, 00:21:09.175 "unmap": true, 00:21:09.175 "flush": true, 00:21:09.175 "reset": true, 00:21:09.175 "nvme_admin": false, 00:21:09.175 "nvme_io": false, 00:21:09.175 "nvme_io_md": false, 00:21:09.175 "write_zeroes": true, 00:21:09.175 "zcopy": true, 00:21:09.175 "get_zone_info": false, 00:21:09.175 "zone_management": false, 00:21:09.175 "zone_append": false, 00:21:09.175 "compare": false, 00:21:09.175 "compare_and_write": false, 00:21:09.175 "abort": true, 00:21:09.175 "seek_hole": false, 00:21:09.175 "seek_data": false, 00:21:09.175 "copy": true, 00:21:09.175 "nvme_iov_md": false 00:21:09.175 }, 00:21:09.175 "memory_domains": [ 00:21:09.175 { 00:21:09.175 "dma_device_id": "system", 00:21:09.175 "dma_device_type": 1 00:21:09.175 }, 00:21:09.175 { 00:21:09.175 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.175 "dma_device_type": 2 00:21:09.175 } 00:21:09.175 ], 00:21:09.175 "driver_specific": { 00:21:09.175 "passthru": { 00:21:09.175 "name": "pt1", 00:21:09.175 "base_bdev_name": "malloc1" 00:21:09.175 } 00:21:09.175 } 00:21:09.175 }' 00:21:09.175 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.175 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.175 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:09.175 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.175 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.175 10:28:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:09.175 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:09.433 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:09.691 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:09.691 "name": "pt2", 00:21:09.691 "aliases": [ 00:21:09.691 "00000000-0000-0000-0000-000000000002" 00:21:09.691 ], 00:21:09.691 "product_name": "passthru", 00:21:09.691 "block_size": 512, 00:21:09.691 "num_blocks": 65536, 00:21:09.691 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:09.691 "assigned_rate_limits": { 00:21:09.691 "rw_ios_per_sec": 0, 00:21:09.691 "rw_mbytes_per_sec": 0, 00:21:09.691 "r_mbytes_per_sec": 0, 00:21:09.691 "w_mbytes_per_sec": 0 00:21:09.691 }, 00:21:09.691 "claimed": true, 00:21:09.691 "claim_type": "exclusive_write", 00:21:09.691 "zoned": false, 00:21:09.691 "supported_io_types": { 00:21:09.691 "read": true, 00:21:09.691 "write": true, 00:21:09.691 "unmap": true, 00:21:09.691 "flush": true, 00:21:09.691 "reset": true, 00:21:09.691 "nvme_admin": false, 00:21:09.691 
"nvme_io": false, 00:21:09.691 "nvme_io_md": false, 00:21:09.691 "write_zeroes": true, 00:21:09.692 "zcopy": true, 00:21:09.692 "get_zone_info": false, 00:21:09.692 "zone_management": false, 00:21:09.692 "zone_append": false, 00:21:09.692 "compare": false, 00:21:09.692 "compare_and_write": false, 00:21:09.692 "abort": true, 00:21:09.692 "seek_hole": false, 00:21:09.692 "seek_data": false, 00:21:09.692 "copy": true, 00:21:09.692 "nvme_iov_md": false 00:21:09.692 }, 00:21:09.692 "memory_domains": [ 00:21:09.692 { 00:21:09.692 "dma_device_id": "system", 00:21:09.692 "dma_device_type": 1 00:21:09.692 }, 00:21:09.692 { 00:21:09.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.692 "dma_device_type": 2 00:21:09.692 } 00:21:09.692 ], 00:21:09.692 "driver_specific": { 00:21:09.692 "passthru": { 00:21:09.692 "name": "pt2", 00:21:09.692 "base_bdev_name": "malloc2" 00:21:09.692 } 00:21:09.692 } 00:21:09.692 }' 00:21:09.692 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.692 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:09.692 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:09.692 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.950 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:09.950 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:09.950 10:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:09.950 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:10.208 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:10.208 "name": "pt3", 00:21:10.208 "aliases": [ 00:21:10.208 "00000000-0000-0000-0000-000000000003" 00:21:10.208 ], 00:21:10.208 "product_name": "passthru", 00:21:10.208 "block_size": 512, 00:21:10.208 "num_blocks": 65536, 00:21:10.208 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:10.208 "assigned_rate_limits": { 00:21:10.208 "rw_ios_per_sec": 0, 00:21:10.208 "rw_mbytes_per_sec": 0, 00:21:10.208 "r_mbytes_per_sec": 0, 00:21:10.208 "w_mbytes_per_sec": 0 00:21:10.208 }, 00:21:10.208 "claimed": true, 00:21:10.208 "claim_type": "exclusive_write", 00:21:10.208 "zoned": false, 00:21:10.208 "supported_io_types": { 00:21:10.208 "read": true, 00:21:10.208 "write": true, 00:21:10.208 "unmap": true, 00:21:10.208 "flush": true, 00:21:10.208 "reset": true, 00:21:10.208 "nvme_admin": false, 00:21:10.208 "nvme_io": false, 00:21:10.208 "nvme_io_md": false, 00:21:10.208 "write_zeroes": true, 00:21:10.208 "zcopy": true, 00:21:10.208 "get_zone_info": false, 00:21:10.208 "zone_management": false, 00:21:10.208 "zone_append": false, 00:21:10.208 "compare": false, 00:21:10.208 "compare_and_write": false, 00:21:10.208 "abort": true, 00:21:10.208 "seek_hole": false, 00:21:10.208 "seek_data": false, 00:21:10.208 "copy": true, 00:21:10.208 "nvme_iov_md": false 00:21:10.208 }, 00:21:10.208 "memory_domains": [ 00:21:10.208 { 00:21:10.208 "dma_device_id": "system", 00:21:10.208 
"dma_device_type": 1 00:21:10.208 }, 00:21:10.208 { 00:21:10.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:10.208 "dma_device_type": 2 00:21:10.208 } 00:21:10.208 ], 00:21:10.208 "driver_specific": { 00:21:10.208 "passthru": { 00:21:10.209 "name": "pt3", 00:21:10.209 "base_bdev_name": "malloc3" 00:21:10.209 } 00:21:10.209 } 00:21:10.209 }' 00:21:10.209 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:10.209 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:10.467 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:10.725 10:28:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:10.725 "name": "pt4", 00:21:10.725 "aliases": [ 00:21:10.725 "00000000-0000-0000-0000-000000000004" 00:21:10.725 ], 00:21:10.725 "product_name": "passthru", 00:21:10.725 "block_size": 512, 00:21:10.725 "num_blocks": 65536, 00:21:10.725 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:10.725 "assigned_rate_limits": { 00:21:10.725 "rw_ios_per_sec": 0, 00:21:10.725 "rw_mbytes_per_sec": 0, 00:21:10.725 "r_mbytes_per_sec": 0, 00:21:10.725 "w_mbytes_per_sec": 0 00:21:10.725 }, 00:21:10.725 "claimed": true, 00:21:10.725 "claim_type": "exclusive_write", 00:21:10.725 "zoned": false, 00:21:10.725 "supported_io_types": { 00:21:10.725 "read": true, 00:21:10.725 "write": true, 00:21:10.725 "unmap": true, 00:21:10.725 "flush": true, 00:21:10.725 "reset": true, 00:21:10.725 "nvme_admin": false, 00:21:10.725 "nvme_io": false, 00:21:10.725 "nvme_io_md": false, 00:21:10.725 "write_zeroes": true, 00:21:10.725 "zcopy": true, 00:21:10.725 "get_zone_info": false, 00:21:10.725 "zone_management": false, 00:21:10.725 "zone_append": false, 00:21:10.725 "compare": false, 00:21:10.725 "compare_and_write": false, 00:21:10.725 "abort": true, 00:21:10.725 "seek_hole": false, 00:21:10.725 "seek_data": false, 00:21:10.725 "copy": true, 00:21:10.725 "nvme_iov_md": false 00:21:10.725 }, 00:21:10.725 "memory_domains": [ 00:21:10.725 { 00:21:10.725 "dma_device_id": "system", 00:21:10.725 "dma_device_type": 1 00:21:10.725 }, 00:21:10.725 { 00:21:10.725 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:10.725 "dma_device_type": 2 00:21:10.725 } 00:21:10.725 ], 00:21:10.725 "driver_specific": { 00:21:10.725 "passthru": { 00:21:10.725 "name": "pt4", 00:21:10.725 "base_bdev_name": "malloc4" 00:21:10.725 } 00:21:10.725 } 00:21:10.725 }' 00:21:10.725 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:10.725 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:10.725 10:28:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:10.725 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:10.983 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:10.983 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:10.983 10:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:10.983 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:10.983 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:10.983 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:10.983 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:10.983 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:10.983 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:10.983 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:11.241 [2024-07-15 10:28:48.378877] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:11.241 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=4d54acd8-82f4-4233-82e1-de41afbd741c 00:21:11.241 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 4d54acd8-82f4-4233-82e1-de41afbd741c ']' 00:21:11.241 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:11.499 [2024-07-15 10:28:48.619213] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:11.499 
[2024-07-15 10:28:48.619234] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:11.499 [2024-07-15 10:28:48.619285] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:11.499 [2024-07-15 10:28:48.619348] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:11.499 [2024-07-15 10:28:48.619360] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2681530 name raid_bdev1, state offline 00:21:11.500 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.500 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:11.757 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:11.757 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:11.757 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:11.757 10:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:12.016 10:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:12.016 10:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:12.275 10:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:12.275 10:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:12.533 10:28:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:12.533 10:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:12.791 10:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:12.791 10:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:13.049 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:13.307 [2024-07-15 10:28:50.327650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:13.307 [2024-07-15 10:28:50.329059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:13.307 [2024-07-15 10:28:50.329103] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:13.307 [2024-07-15 10:28:50.329137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:13.307 [2024-07-15 10:28:50.329183] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:13.307 [2024-07-15 10:28:50.329225] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:13.307 [2024-07-15 10:28:50.329248] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:13.307 [2024-07-15 10:28:50.329270] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:13.307 
[2024-07-15 10:28:50.329289] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:13.307 [2024-07-15 10:28:50.329299] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x282cff0 name raid_bdev1, state configuring 00:21:13.307 request: 00:21:13.307 { 00:21:13.307 "name": "raid_bdev1", 00:21:13.307 "raid_level": "concat", 00:21:13.307 "base_bdevs": [ 00:21:13.307 "malloc1", 00:21:13.307 "malloc2", 00:21:13.307 "malloc3", 00:21:13.307 "malloc4" 00:21:13.307 ], 00:21:13.307 "strip_size_kb": 64, 00:21:13.307 "superblock": false, 00:21:13.307 "method": "bdev_raid_create", 00:21:13.307 "req_id": 1 00:21:13.307 } 00:21:13.307 Got JSON-RPC error response 00:21:13.307 response: 00:21:13.307 { 00:21:13.307 "code": -17, 00:21:13.307 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:13.307 } 00:21:13.307 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:13.307 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:13.307 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:13.307 10:28:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:13.307 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.307 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:13.566 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:13.566 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:13.566 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:21:13.824 [2024-07-15 10:28:50.821062] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:13.824 [2024-07-15 10:28:50.821108] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:13.824 [2024-07-15 10:28:50.821130] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26897a0 00:21:13.824 [2024-07-15 10:28:50.821142] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:13.824 [2024-07-15 10:28:50.822740] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:13.824 [2024-07-15 10:28:50.822770] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:13.824 [2024-07-15 10:28:50.822834] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:13.824 [2024-07-15 10:28:50.822860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:13.824 pt1 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.824 10:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.082 10:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.082 "name": "raid_bdev1", 00:21:14.082 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:14.082 "strip_size_kb": 64, 00:21:14.082 "state": "configuring", 00:21:14.082 "raid_level": "concat", 00:21:14.082 "superblock": true, 00:21:14.082 "num_base_bdevs": 4, 00:21:14.082 "num_base_bdevs_discovered": 1, 00:21:14.082 "num_base_bdevs_operational": 4, 00:21:14.082 "base_bdevs_list": [ 00:21:14.082 { 00:21:14.082 "name": "pt1", 00:21:14.082 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:14.082 "is_configured": true, 00:21:14.082 "data_offset": 2048, 00:21:14.082 "data_size": 63488 00:21:14.082 }, 00:21:14.082 { 00:21:14.082 "name": null, 00:21:14.082 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:14.082 "is_configured": false, 00:21:14.082 "data_offset": 2048, 00:21:14.082 "data_size": 63488 00:21:14.082 }, 00:21:14.082 { 00:21:14.082 "name": null, 00:21:14.082 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:14.082 "is_configured": false, 00:21:14.082 "data_offset": 2048, 00:21:14.082 "data_size": 63488 00:21:14.082 }, 00:21:14.082 { 00:21:14.082 "name": null, 00:21:14.082 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:14.082 "is_configured": false, 00:21:14.082 "data_offset": 2048, 00:21:14.082 "data_size": 63488 00:21:14.082 } 00:21:14.082 ] 00:21:14.082 }' 00:21:14.082 10:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.082 10:28:51 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.649 10:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:14.649 10:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:14.906 [2024-07-15 10:28:51.895932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:14.907 [2024-07-15 10:28:51.895979] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:14.907 [2024-07-15 10:28:51.895997] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2680ea0 00:21:14.907 [2024-07-15 10:28:51.896010] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:14.907 [2024-07-15 10:28:51.896345] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:14.907 [2024-07-15 10:28:51.896364] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:14.907 [2024-07-15 10:28:51.896421] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:14.907 [2024-07-15 10:28:51.896439] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:14.907 pt2 00:21:14.907 10:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:15.165 [2024-07-15 10:28:52.140578] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:15.165 10:28:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.165 "name": "raid_bdev1", 00:21:15.165 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:15.165 "strip_size_kb": 64, 00:21:15.165 "state": "configuring", 00:21:15.165 "raid_level": "concat", 00:21:15.165 "superblock": true, 00:21:15.165 "num_base_bdevs": 4, 00:21:15.165 "num_base_bdevs_discovered": 1, 00:21:15.165 "num_base_bdevs_operational": 4, 00:21:15.165 "base_bdevs_list": [ 00:21:15.165 { 00:21:15.165 "name": "pt1", 00:21:15.165 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:15.165 "is_configured": true, 00:21:15.165 "data_offset": 2048, 00:21:15.165 "data_size": 63488 00:21:15.165 }, 00:21:15.165 { 00:21:15.165 "name": null, 00:21:15.165 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:15.165 
"is_configured": false, 00:21:15.165 "data_offset": 2048, 00:21:15.165 "data_size": 63488 00:21:15.165 }, 00:21:15.165 { 00:21:15.165 "name": null, 00:21:15.165 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:15.165 "is_configured": false, 00:21:15.165 "data_offset": 2048, 00:21:15.165 "data_size": 63488 00:21:15.165 }, 00:21:15.165 { 00:21:15.165 "name": null, 00:21:15.165 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:15.165 "is_configured": false, 00:21:15.165 "data_offset": 2048, 00:21:15.165 "data_size": 63488 00:21:15.165 } 00:21:15.165 ] 00:21:15.165 }' 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.165 10:28:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:15.730 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:15.730 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:15.730 10:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:15.988 [2024-07-15 10:28:53.046990] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:15.988 [2024-07-15 10:28:53.047034] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:15.988 [2024-07-15 10:28:53.047053] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x267fec0 00:21:15.988 [2024-07-15 10:28:53.047066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:15.989 [2024-07-15 10:28:53.047393] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:15.989 [2024-07-15 10:28:53.047411] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:15.989 [2024-07-15 10:28:53.047471] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:15.989 [2024-07-15 10:28:53.047489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:15.989 pt2 00:21:15.989 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:15.989 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:15.989 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:16.245 [2024-07-15 10:28:53.219449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:16.245 [2024-07-15 10:28:53.219479] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:16.245 [2024-07-15 10:28:53.219495] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26800f0 00:21:16.245 [2024-07-15 10:28:53.219508] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:16.245 [2024-07-15 10:28:53.219797] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:16.245 [2024-07-15 10:28:53.219815] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:16.245 [2024-07-15 10:28:53.219866] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:16.245 [2024-07-15 10:28:53.219883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:16.245 pt3 00:21:16.245 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:21:16.245 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:16.245 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:16.502 [2024-07-15 10:28:53.468117] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:16.502 [2024-07-15 10:28:53.468148] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:16.502 [2024-07-15 10:28:53.468164] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2688af0 00:21:16.502 [2024-07-15 10:28:53.468176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:16.502 [2024-07-15 10:28:53.468453] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:16.502 [2024-07-15 10:28:53.468471] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:16.502 [2024-07-15 10:28:53.468520] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:16.502 [2024-07-15 10:28:53.468543] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:16.502 [2024-07-15 10:28:53.468659] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26828f0 00:21:16.502 [2024-07-15 10:28:53.468670] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:16.502 [2024-07-15 10:28:53.468836] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2682150 00:21:16.502 [2024-07-15 10:28:53.468973] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26828f0 00:21:16.502 [2024-07-15 10:28:53.468983] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26828f0 00:21:16.502 [2024-07-15 10:28:53.469080] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:16.502 pt4 00:21:16.502 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:21:16.502 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.503 "name": "raid_bdev1", 00:21:16.503 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:16.503 "strip_size_kb": 64, 00:21:16.503 "state": "online", 00:21:16.503 "raid_level": "concat", 00:21:16.503 "superblock": true, 00:21:16.503 "num_base_bdevs": 4, 00:21:16.503 "num_base_bdevs_discovered": 4, 00:21:16.503 "num_base_bdevs_operational": 4, 
00:21:16.503 "base_bdevs_list": [ 00:21:16.503 { 00:21:16.503 "name": "pt1", 00:21:16.503 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:16.503 "is_configured": true, 00:21:16.503 "data_offset": 2048, 00:21:16.503 "data_size": 63488 00:21:16.503 }, 00:21:16.503 { 00:21:16.503 "name": "pt2", 00:21:16.503 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:16.503 "is_configured": true, 00:21:16.503 "data_offset": 2048, 00:21:16.503 "data_size": 63488 00:21:16.503 }, 00:21:16.503 { 00:21:16.503 "name": "pt3", 00:21:16.503 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:16.503 "is_configured": true, 00:21:16.503 "data_offset": 2048, 00:21:16.503 "data_size": 63488 00:21:16.503 }, 00:21:16.503 { 00:21:16.503 "name": "pt4", 00:21:16.503 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:16.503 "is_configured": true, 00:21:16.503 "data_offset": 2048, 00:21:16.503 "data_size": 63488 00:21:16.503 } 00:21:16.503 ] 00:21:16.503 }' 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.503 10:28:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:17.434 [2024-07-15 10:28:54.507211] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:17.434 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:17.434 "name": "raid_bdev1", 00:21:17.434 "aliases": [ 00:21:17.434 "4d54acd8-82f4-4233-82e1-de41afbd741c" 00:21:17.434 ], 00:21:17.434 "product_name": "Raid Volume", 00:21:17.434 "block_size": 512, 00:21:17.434 "num_blocks": 253952, 00:21:17.434 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:17.434 "assigned_rate_limits": { 00:21:17.434 "rw_ios_per_sec": 0, 00:21:17.434 "rw_mbytes_per_sec": 0, 00:21:17.434 "r_mbytes_per_sec": 0, 00:21:17.434 "w_mbytes_per_sec": 0 00:21:17.434 }, 00:21:17.434 "claimed": false, 00:21:17.434 "zoned": false, 00:21:17.434 "supported_io_types": { 00:21:17.434 "read": true, 00:21:17.434 "write": true, 00:21:17.434 "unmap": true, 00:21:17.434 "flush": true, 00:21:17.434 "reset": true, 00:21:17.434 "nvme_admin": false, 00:21:17.434 "nvme_io": false, 00:21:17.434 "nvme_io_md": false, 00:21:17.434 "write_zeroes": true, 00:21:17.434 "zcopy": false, 00:21:17.434 "get_zone_info": false, 00:21:17.434 "zone_management": false, 00:21:17.434 "zone_append": false, 00:21:17.434 "compare": false, 00:21:17.434 "compare_and_write": false, 00:21:17.434 "abort": false, 00:21:17.434 "seek_hole": false, 00:21:17.434 "seek_data": false, 00:21:17.434 "copy": false, 00:21:17.434 "nvme_iov_md": false 00:21:17.434 }, 00:21:17.434 "memory_domains": [ 00:21:17.434 { 00:21:17.434 "dma_device_id": "system", 00:21:17.434 "dma_device_type": 1 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.434 "dma_device_type": 2 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "dma_device_id": "system", 00:21:17.434 "dma_device_type": 1 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:21:17.434 "dma_device_type": 2 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "dma_device_id": "system", 00:21:17.434 "dma_device_type": 1 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.434 "dma_device_type": 2 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "dma_device_id": "system", 00:21:17.434 "dma_device_type": 1 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.434 "dma_device_type": 2 00:21:17.434 } 00:21:17.434 ], 00:21:17.434 "driver_specific": { 00:21:17.434 "raid": { 00:21:17.434 "uuid": "4d54acd8-82f4-4233-82e1-de41afbd741c", 00:21:17.434 "strip_size_kb": 64, 00:21:17.434 "state": "online", 00:21:17.434 "raid_level": "concat", 00:21:17.434 "superblock": true, 00:21:17.434 "num_base_bdevs": 4, 00:21:17.434 "num_base_bdevs_discovered": 4, 00:21:17.434 "num_base_bdevs_operational": 4, 00:21:17.434 "base_bdevs_list": [ 00:21:17.434 { 00:21:17.434 "name": "pt1", 00:21:17.434 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:17.434 "is_configured": true, 00:21:17.434 "data_offset": 2048, 00:21:17.434 "data_size": 63488 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "name": "pt2", 00:21:17.434 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:17.434 "is_configured": true, 00:21:17.434 "data_offset": 2048, 00:21:17.434 "data_size": 63488 00:21:17.434 }, 00:21:17.434 { 00:21:17.434 "name": "pt3", 00:21:17.434 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:17.434 "is_configured": true, 00:21:17.434 "data_offset": 2048, 00:21:17.434 "data_size": 63488 00:21:17.435 }, 00:21:17.435 { 00:21:17.435 "name": "pt4", 00:21:17.435 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:17.435 "is_configured": true, 00:21:17.435 "data_offset": 2048, 00:21:17.435 "data_size": 63488 00:21:17.435 } 00:21:17.435 ] 00:21:17.435 } 00:21:17.435 } 00:21:17.435 }' 00:21:17.435 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:17.435 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:17.435 pt2 00:21:17.435 pt3 00:21:17.435 pt4' 00:21:17.435 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:17.435 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:17.435 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:17.693 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:17.693 "name": "pt1", 00:21:17.693 "aliases": [ 00:21:17.693 "00000000-0000-0000-0000-000000000001" 00:21:17.693 ], 00:21:17.693 "product_name": "passthru", 00:21:17.693 "block_size": 512, 00:21:17.693 "num_blocks": 65536, 00:21:17.693 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:17.693 "assigned_rate_limits": { 00:21:17.693 "rw_ios_per_sec": 0, 00:21:17.693 "rw_mbytes_per_sec": 0, 00:21:17.693 "r_mbytes_per_sec": 0, 00:21:17.693 "w_mbytes_per_sec": 0 00:21:17.693 }, 00:21:17.693 "claimed": true, 00:21:17.693 "claim_type": "exclusive_write", 00:21:17.693 "zoned": false, 00:21:17.693 "supported_io_types": { 00:21:17.693 "read": true, 00:21:17.693 "write": true, 00:21:17.693 "unmap": true, 00:21:17.693 "flush": true, 00:21:17.693 "reset": true, 00:21:17.693 "nvme_admin": false, 00:21:17.693 "nvme_io": false, 00:21:17.693 "nvme_io_md": false, 00:21:17.693 "write_zeroes": true, 00:21:17.693 "zcopy": true, 00:21:17.693 "get_zone_info": false, 00:21:17.693 "zone_management": false, 00:21:17.693 "zone_append": false, 00:21:17.693 "compare": false, 00:21:17.693 "compare_and_write": false, 00:21:17.693 "abort": true, 00:21:17.693 "seek_hole": false, 00:21:17.693 "seek_data": false, 00:21:17.693 "copy": true, 00:21:17.693 "nvme_iov_md": 
false 00:21:17.693 }, 00:21:17.693 "memory_domains": [ 00:21:17.693 { 00:21:17.693 "dma_device_id": "system", 00:21:17.693 "dma_device_type": 1 00:21:17.693 }, 00:21:17.693 { 00:21:17.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.693 "dma_device_type": 2 00:21:17.693 } 00:21:17.693 ], 00:21:17.693 "driver_specific": { 00:21:17.693 "passthru": { 00:21:17.693 "name": "pt1", 00:21:17.693 "base_bdev_name": "malloc1" 00:21:17.693 } 00:21:17.693 } 00:21:17.693 }' 00:21:17.693 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:17.693 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:17.951 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:17.951 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:17.951 10:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:17.951 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:17.951 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:17.951 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:17.951 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:17.951 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:17.951 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:18.209 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:18.209 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:18.209 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:18.209 10:28:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:18.467 "name": "pt2", 00:21:18.467 "aliases": [ 00:21:18.467 "00000000-0000-0000-0000-000000000002" 00:21:18.467 ], 00:21:18.467 "product_name": "passthru", 00:21:18.467 "block_size": 512, 00:21:18.467 "num_blocks": 65536, 00:21:18.467 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:18.467 "assigned_rate_limits": { 00:21:18.467 "rw_ios_per_sec": 0, 00:21:18.467 "rw_mbytes_per_sec": 0, 00:21:18.467 "r_mbytes_per_sec": 0, 00:21:18.467 "w_mbytes_per_sec": 0 00:21:18.467 }, 00:21:18.467 "claimed": true, 00:21:18.467 "claim_type": "exclusive_write", 00:21:18.467 "zoned": false, 00:21:18.467 "supported_io_types": { 00:21:18.467 "read": true, 00:21:18.467 "write": true, 00:21:18.467 "unmap": true, 00:21:18.467 "flush": true, 00:21:18.467 "reset": true, 00:21:18.467 "nvme_admin": false, 00:21:18.467 "nvme_io": false, 00:21:18.467 "nvme_io_md": false, 00:21:18.467 "write_zeroes": true, 00:21:18.467 "zcopy": true, 00:21:18.467 "get_zone_info": false, 00:21:18.467 "zone_management": false, 00:21:18.467 "zone_append": false, 00:21:18.467 "compare": false, 00:21:18.467 "compare_and_write": false, 00:21:18.467 "abort": true, 00:21:18.467 "seek_hole": false, 00:21:18.467 "seek_data": false, 00:21:18.467 "copy": true, 00:21:18.467 "nvme_iov_md": false 00:21:18.467 }, 00:21:18.467 "memory_domains": [ 00:21:18.467 { 00:21:18.467 "dma_device_id": "system", 00:21:18.467 "dma_device_type": 1 00:21:18.467 }, 00:21:18.467 { 00:21:18.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.467 "dma_device_type": 2 00:21:18.467 } 00:21:18.467 ], 00:21:18.467 "driver_specific": { 00:21:18.467 "passthru": { 00:21:18.467 "name": "pt2", 00:21:18.467 "base_bdev_name": "malloc2" 00:21:18.467 } 00:21:18.467 } 00:21:18.467 }' 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:18.467 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:18.725 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:18.725 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:18.725 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:18.725 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:18.725 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:18.725 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:18.725 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:18.983 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:18.983 "name": "pt3", 00:21:18.983 "aliases": [ 00:21:18.983 "00000000-0000-0000-0000-000000000003" 00:21:18.983 ], 00:21:18.983 "product_name": "passthru", 00:21:18.983 "block_size": 512, 00:21:18.983 "num_blocks": 65536, 00:21:18.984 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:18.984 "assigned_rate_limits": { 00:21:18.984 "rw_ios_per_sec": 0, 00:21:18.984 "rw_mbytes_per_sec": 0, 
00:21:18.984 "r_mbytes_per_sec": 0, 00:21:18.984 "w_mbytes_per_sec": 0 00:21:18.984 }, 00:21:18.984 "claimed": true, 00:21:18.984 "claim_type": "exclusive_write", 00:21:18.984 "zoned": false, 00:21:18.984 "supported_io_types": { 00:21:18.984 "read": true, 00:21:18.984 "write": true, 00:21:18.984 "unmap": true, 00:21:18.984 "flush": true, 00:21:18.984 "reset": true, 00:21:18.984 "nvme_admin": false, 00:21:18.984 "nvme_io": false, 00:21:18.984 "nvme_io_md": false, 00:21:18.984 "write_zeroes": true, 00:21:18.984 "zcopy": true, 00:21:18.984 "get_zone_info": false, 00:21:18.984 "zone_management": false, 00:21:18.984 "zone_append": false, 00:21:18.984 "compare": false, 00:21:18.984 "compare_and_write": false, 00:21:18.984 "abort": true, 00:21:18.984 "seek_hole": false, 00:21:18.984 "seek_data": false, 00:21:18.984 "copy": true, 00:21:18.984 "nvme_iov_md": false 00:21:18.984 }, 00:21:18.984 "memory_domains": [ 00:21:18.984 { 00:21:18.984 "dma_device_id": "system", 00:21:18.984 "dma_device_type": 1 00:21:18.984 }, 00:21:18.984 { 00:21:18.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.984 "dma_device_type": 2 00:21:18.984 } 00:21:18.984 ], 00:21:18.984 "driver_specific": { 00:21:18.984 "passthru": { 00:21:18.984 "name": "pt3", 00:21:18.984 "base_bdev_name": "malloc3" 00:21:18.984 } 00:21:18.984 } 00:21:18.984 }' 00:21:18.984 10:28:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.984 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.984 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:18.984 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.984 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.984 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:18.984 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:19.242 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:19.556 "name": "pt4", 00:21:19.556 "aliases": [ 00:21:19.556 "00000000-0000-0000-0000-000000000004" 00:21:19.556 ], 00:21:19.556 "product_name": "passthru", 00:21:19.556 "block_size": 512, 00:21:19.556 "num_blocks": 65536, 00:21:19.556 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:19.556 "assigned_rate_limits": { 00:21:19.556 "rw_ios_per_sec": 0, 00:21:19.556 "rw_mbytes_per_sec": 0, 00:21:19.556 "r_mbytes_per_sec": 0, 00:21:19.556 "w_mbytes_per_sec": 0 00:21:19.556 }, 00:21:19.556 "claimed": true, 00:21:19.556 "claim_type": "exclusive_write", 00:21:19.556 "zoned": false, 00:21:19.556 "supported_io_types": { 00:21:19.556 "read": true, 00:21:19.556 "write": true, 00:21:19.556 "unmap": true, 00:21:19.556 "flush": true, 00:21:19.556 "reset": true, 00:21:19.556 "nvme_admin": false, 00:21:19.556 "nvme_io": false, 00:21:19.556 "nvme_io_md": false, 00:21:19.556 "write_zeroes": true, 00:21:19.556 "zcopy": true, 00:21:19.556 "get_zone_info": false, 00:21:19.556 
"zone_management": false, 00:21:19.556 "zone_append": false, 00:21:19.556 "compare": false, 00:21:19.556 "compare_and_write": false, 00:21:19.556 "abort": true, 00:21:19.556 "seek_hole": false, 00:21:19.556 "seek_data": false, 00:21:19.556 "copy": true, 00:21:19.556 "nvme_iov_md": false 00:21:19.556 }, 00:21:19.556 "memory_domains": [ 00:21:19.556 { 00:21:19.556 "dma_device_id": "system", 00:21:19.556 "dma_device_type": 1 00:21:19.556 }, 00:21:19.556 { 00:21:19.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:19.556 "dma_device_type": 2 00:21:19.556 } 00:21:19.556 ], 00:21:19.556 "driver_specific": { 00:21:19.556 "passthru": { 00:21:19.556 "name": "pt4", 00:21:19.556 "base_bdev_name": "malloc4" 00:21:19.556 } 00:21:19.556 } 00:21:19.556 }' 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:19.556 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.813 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.813 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.813 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.813 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.813 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.813 10:28:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:19.813 10:28:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:20.071 [2024-07-15 10:28:57.142229] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 4d54acd8-82f4-4233-82e1-de41afbd741c '!=' 4d54acd8-82f4-4233-82e1-de41afbd741c ']' 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 559780 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 559780 ']' 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 559780 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 559780 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 559780' 00:21:20.071 killing process with pid 559780 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 559780 
00:21:20.071 [2024-07-15 10:28:57.207982] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:20.071 [2024-07-15 10:28:57.208045] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:20.071 [2024-07-15 10:28:57.208106] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:20.071 [2024-07-15 10:28:57.208123] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26828f0 name raid_bdev1, state offline 00:21:20.071 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 559780 00:21:20.071 [2024-07-15 10:28:57.244799] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:20.329 10:28:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:20.329 00:21:20.329 real 0m15.768s 00:21:20.329 user 0m28.384s 00:21:20.329 sys 0m2.901s 00:21:20.329 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:20.329 10:28:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:20.329 ************************************ 00:21:20.329 END TEST raid_superblock_test 00:21:20.329 ************************************ 00:21:20.329 10:28:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:20.329 10:28:57 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:21:20.329 10:28:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:20.329 10:28:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:20.329 10:28:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:20.587 ************************************ 00:21:20.587 START TEST raid_read_error_test 00:21:20.587 ************************************ 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:21:20.587 
10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:20.587 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:20.588 10:28:57 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.80NLDMZkar 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=562209 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 562209 /var/tmp/spdk-raid.sock 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 562209 ']' 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen 
on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:20.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:20.588 10:28:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:20.588 [2024-07-15 10:28:57.598591] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:21:20.588 [2024-07-15 10:28:57.598657] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid562209 ] 00:21:20.588 [2024-07-15 10:28:57.727922] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:20.846 [2024-07-15 10:28:57.834899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:20.846 [2024-07-15 10:28:57.912334] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:20.846 [2024-07-15 10:28:57.912376] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:21.413 10:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:21.413 10:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:21.413 10:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:21.413 10:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:21.671 BaseBdev1_malloc 00:21:21.671 10:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 
00:21:21.929 true 00:21:21.929 10:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:22.187 [2024-07-15 10:28:59.235984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:22.187 [2024-07-15 10:28:59.236031] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.187 [2024-07-15 10:28:59.236052] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1daf0d0 00:21:22.187 [2024-07-15 10:28:59.236066] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.187 [2024-07-15 10:28:59.237934] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:22.187 [2024-07-15 10:28:59.237967] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:22.187 BaseBdev1 00:21:22.187 10:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:22.187 10:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:22.444 BaseBdev2_malloc 00:21:22.444 10:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:22.702 true 00:21:22.702 10:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:22.959 [2024-07-15 10:28:59.951684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:22.959 [2024-07-15 10:28:59.951731] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.959 [2024-07-15 10:28:59.951753] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db3910 00:21:22.959 [2024-07-15 10:28:59.951765] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.959 [2024-07-15 10:28:59.953383] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:22.959 [2024-07-15 10:28:59.953413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:22.959 BaseBdev2 00:21:22.959 10:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:22.959 10:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:23.215 BaseBdev3_malloc 00:21:23.215 10:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:23.472 true 00:21:23.472 10:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:23.729 [2024-07-15 10:29:00.691507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:23.729 [2024-07-15 10:29:00.691552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:23.729 [2024-07-15 10:29:00.691572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db5bd0 00:21:23.729 [2024-07-15 10:29:00.691586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:23.729 [2024-07-15 10:29:00.693168] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:21:23.729 [2024-07-15 10:29:00.693198] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:23.729 BaseBdev3 00:21:23.729 10:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:23.729 10:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:23.985 BaseBdev4_malloc 00:21:23.985 10:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:23.985 true 00:21:24.242 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:24.242 [2024-07-15 10:29:01.413969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:24.242 [2024-07-15 10:29:01.414015] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:24.242 [2024-07-15 10:29:01.414037] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db6aa0 00:21:24.242 [2024-07-15 10:29:01.414050] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:24.242 [2024-07-15 10:29:01.415649] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:24.242 [2024-07-15 10:29:01.415680] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:24.242 BaseBdev4 00:21:24.242 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 
BaseBdev4' -n raid_bdev1 -s 00:21:24.499 [2024-07-15 10:29:01.646616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:24.499 [2024-07-15 10:29:01.647993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:24.499 [2024-07-15 10:29:01.648070] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:24.499 [2024-07-15 10:29:01.648131] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:24.499 [2024-07-15 10:29:01.648357] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1db0c20 00:21:24.499 [2024-07-15 10:29:01.648369] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:24.499 [2024-07-15 10:29:01.648569] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c05260 00:21:24.499 [2024-07-15 10:29:01.648719] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1db0c20 00:21:24.499 [2024-07-15 10:29:01.648729] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1db0c20 00:21:24.499 [2024-07-15 10:29:01.648834] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:24.499 10:29:01 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.499 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.757 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:24.757 "name": "raid_bdev1", 00:21:24.757 "uuid": "f4f36c09-8086-4fe3-aa05-994750ac0ca6", 00:21:24.757 "strip_size_kb": 64, 00:21:24.757 "state": "online", 00:21:24.757 "raid_level": "concat", 00:21:24.757 "superblock": true, 00:21:24.757 "num_base_bdevs": 4, 00:21:24.757 "num_base_bdevs_discovered": 4, 00:21:24.757 "num_base_bdevs_operational": 4, 00:21:24.757 "base_bdevs_list": [ 00:21:24.757 { 00:21:24.757 "name": "BaseBdev1", 00:21:24.757 "uuid": "cf48d7ac-ff36-5a70-8630-68e9aaa27cdf", 00:21:24.757 "is_configured": true, 00:21:24.757 "data_offset": 2048, 00:21:24.757 "data_size": 63488 00:21:24.757 }, 00:21:24.757 { 00:21:24.757 "name": "BaseBdev2", 00:21:24.757 "uuid": "278f5dbb-be00-58f1-8700-22f70e5f8740", 00:21:24.757 "is_configured": true, 00:21:24.757 "data_offset": 2048, 00:21:24.757 "data_size": 63488 00:21:24.757 }, 00:21:24.757 { 00:21:24.757 "name": "BaseBdev3", 00:21:24.757 "uuid": "61cb1335-290f-5376-80cc-ebdf199fba3b", 00:21:24.757 "is_configured": true, 00:21:24.757 "data_offset": 2048, 00:21:24.757 "data_size": 63488 00:21:24.757 }, 00:21:24.757 { 00:21:24.757 "name": "BaseBdev4", 00:21:24.757 "uuid": 
"2e857e3f-eb96-58ed-8229-21a11ce5e385", 00:21:24.757 "is_configured": true, 00:21:24.757 "data_offset": 2048, 00:21:24.757 "data_size": 63488 00:21:24.757 } 00:21:24.757 ] 00:21:24.757 }' 00:21:24.757 10:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:24.757 10:29:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.321 10:29:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:25.321 10:29:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:25.579 [2024-07-15 10:29:02.593381] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1da2fc0 00:21:26.511 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.769 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.027 10:29:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.027 "name": "raid_bdev1", 00:21:27.027 "uuid": "f4f36c09-8086-4fe3-aa05-994750ac0ca6", 00:21:27.027 "strip_size_kb": 64, 00:21:27.027 "state": "online", 00:21:27.027 "raid_level": "concat", 00:21:27.027 "superblock": true, 00:21:27.027 "num_base_bdevs": 4, 00:21:27.027 "num_base_bdevs_discovered": 4, 00:21:27.027 "num_base_bdevs_operational": 4, 00:21:27.027 "base_bdevs_list": [ 00:21:27.027 { 00:21:27.027 "name": "BaseBdev1", 00:21:27.027 "uuid": "cf48d7ac-ff36-5a70-8630-68e9aaa27cdf", 00:21:27.027 "is_configured": true, 00:21:27.027 "data_offset": 2048, 00:21:27.027 "data_size": 63488 00:21:27.027 }, 00:21:27.027 { 00:21:27.027 "name": "BaseBdev2", 00:21:27.027 "uuid": "278f5dbb-be00-58f1-8700-22f70e5f8740", 00:21:27.027 "is_configured": true, 00:21:27.027 "data_offset": 2048, 00:21:27.027 "data_size": 63488 00:21:27.027 }, 00:21:27.027 { 00:21:27.027 "name": "BaseBdev3", 00:21:27.027 "uuid": "61cb1335-290f-5376-80cc-ebdf199fba3b", 00:21:27.027 "is_configured": true, 00:21:27.027 "data_offset": 2048, 00:21:27.027 "data_size": 63488 00:21:27.027 }, 00:21:27.027 { 
00:21:27.027 "name": "BaseBdev4", 00:21:27.027 "uuid": "2e857e3f-eb96-58ed-8229-21a11ce5e385", 00:21:27.027 "is_configured": true, 00:21:27.027 "data_offset": 2048, 00:21:27.027 "data_size": 63488 00:21:27.027 } 00:21:27.027 ] 00:21:27.027 }' 00:21:27.027 10:29:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.027 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:27.593 10:29:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:27.593 [2024-07-15 10:29:04.767058] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:27.593 [2024-07-15 10:29:04.767093] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:27.593 [2024-07-15 10:29:04.770270] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:27.593 [2024-07-15 10:29:04.770309] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:27.593 [2024-07-15 10:29:04.770350] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:27.593 [2024-07-15 10:29:04.770361] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1db0c20 name raid_bdev1, state offline 00:21:27.593 0 00:21:27.593 10:29:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 562209 00:21:27.593 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 562209 ']' 00:21:27.593 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 562209 00:21:27.593 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:21:27.593 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:27.852 10:29:04 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 562209 00:21:27.852 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:27.852 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:27.852 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 562209' 00:21:27.852 killing process with pid 562209 00:21:27.852 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 562209 00:21:27.852 [2024-07-15 10:29:04.831727] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:27.852 10:29:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 562209 00:21:27.852 [2024-07-15 10:29:04.862205] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.80NLDMZkar 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:21:28.110 00:21:28.110 real 0m7.553s 00:21:28.110 user 0m12.087s 00:21:28.110 sys 0m1.307s 00:21:28.110 10:29:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:28.111 10:29:05 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:28.111 ************************************ 00:21:28.111 END TEST raid_read_error_test 00:21:28.111 ************************************ 00:21:28.111 10:29:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:28.111 10:29:05 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:21:28.111 10:29:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:28.111 10:29:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:28.111 10:29:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:28.111 ************************************ 00:21:28.111 START TEST raid_write_error_test 00:21:28.111 ************************************ 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.MwTzOHz1nI 00:21:28.111 10:29:05 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=563203 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 563203 /var/tmp/spdk-raid.sock 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 563203 ']' 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:28.111 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:28.111 10:29:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:28.111 [2024-07-15 10:29:05.218093] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:21:28.111 [2024-07-15 10:29:05.218144] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid563203 ] 00:21:28.394 [2024-07-15 10:29:05.330135] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:28.394 [2024-07-15 10:29:05.439466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:28.394 [2024-07-15 10:29:05.509945] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:28.394 [2024-07-15 10:29:05.509984] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:28.959 10:29:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:28.959 10:29:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:21:28.959 10:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:28.959 10:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:29.217 BaseBdev1_malloc 00:21:29.217 10:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:29.475 true 00:21:29.475 10:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:29.733 [2024-07-15 10:29:06.776334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:29.734 [2024-07-15 10:29:06.776377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:21:29.734 [2024-07-15 10:29:06.776398] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x244b0d0 00:21:29.734 [2024-07-15 10:29:06.776411] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:29.734 [2024-07-15 10:29:06.778265] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:29.734 [2024-07-15 10:29:06.778296] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:29.734 BaseBdev1 00:21:29.734 10:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:29.734 10:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:29.992 BaseBdev2_malloc 00:21:29.992 10:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:29.992 true 00:21:30.250 10:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:30.250 [2024-07-15 10:29:07.427931] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:30.250 [2024-07-15 10:29:07.427979] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.250 [2024-07-15 10:29:07.427999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x244f910 00:21:30.250 [2024-07-15 10:29:07.428012] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.250 [2024-07-15 10:29:07.429629] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.250 [2024-07-15 10:29:07.429660] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:30.250 BaseBdev2 00:21:30.508 10:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:30.508 10:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:30.508 BaseBdev3_malloc 00:21:30.508 10:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:30.766 true 00:21:30.766 10:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:31.024 [2024-07-15 10:29:08.071456] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:31.024 [2024-07-15 10:29:08.071502] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.024 [2024-07-15 10:29:08.071522] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2451bd0 00:21:31.024 [2024-07-15 10:29:08.071535] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.024 [2024-07-15 10:29:08.073101] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.024 [2024-07-15 10:29:08.073130] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:31.024 BaseBdev3 00:21:31.024 10:29:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:31.024 10:29:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:31.282 BaseBdev4_malloc 00:21:31.282 10:29:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:31.540 true 00:21:31.540 10:29:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:31.798 [2024-07-15 10:29:08.781904] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:31.798 [2024-07-15 10:29:08.781950] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:31.798 [2024-07-15 10:29:08.781971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2452aa0 00:21:31.798 [2024-07-15 10:29:08.781983] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:31.798 [2024-07-15 10:29:08.783570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:31.798 [2024-07-15 10:29:08.783599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:31.798 BaseBdev4 00:21:31.798 10:29:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:32.056 [2024-07-15 10:29:09.014559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:32.056 [2024-07-15 10:29:09.015939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:32.056 [2024-07-15 10:29:09.016007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:32.056 [2024-07-15 10:29:09.016070] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:32.056 [2024-07-15 10:29:09.016315] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x244cc20 00:21:32.056 [2024-07-15 10:29:09.016327] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:32.056 [2024-07-15 10:29:09.016530] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22a1260 00:21:32.056 [2024-07-15 10:29:09.016684] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x244cc20 00:21:32.056 [2024-07-15 10:29:09.016694] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x244cc20 00:21:32.056 [2024-07-15 10:29:09.016799] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.056 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:32.056 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:32.057 10:29:09 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.057 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.315 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:32.315 "name": "raid_bdev1", 00:21:32.315 "uuid": "c92a1fac-4c32-4412-9185-275ca77dbd2c", 00:21:32.315 "strip_size_kb": 64, 00:21:32.315 "state": "online", 00:21:32.315 "raid_level": "concat", 00:21:32.315 "superblock": true, 00:21:32.315 "num_base_bdevs": 4, 00:21:32.315 "num_base_bdevs_discovered": 4, 00:21:32.315 "num_base_bdevs_operational": 4, 00:21:32.315 "base_bdevs_list": [ 00:21:32.315 { 00:21:32.315 "name": "BaseBdev1", 00:21:32.315 "uuid": "613977ee-e36d-511a-929d-1b3b5afff577", 00:21:32.315 "is_configured": true, 00:21:32.315 "data_offset": 2048, 00:21:32.315 "data_size": 63488 00:21:32.315 }, 00:21:32.315 { 00:21:32.315 "name": "BaseBdev2", 00:21:32.315 "uuid": "5d53b2af-6772-52dd-97d2-abf2dc5411c3", 00:21:32.315 "is_configured": true, 00:21:32.316 "data_offset": 2048, 00:21:32.316 "data_size": 63488 00:21:32.316 }, 00:21:32.316 { 00:21:32.316 "name": "BaseBdev3", 00:21:32.316 "uuid": "61c4fc73-a0c0-5790-bca6-80fce2c05f98", 00:21:32.316 "is_configured": true, 00:21:32.316 "data_offset": 2048, 00:21:32.316 "data_size": 63488 00:21:32.316 }, 00:21:32.316 { 00:21:32.316 "name": "BaseBdev4", 00:21:32.316 "uuid": "bef6d72e-878b-5183-a149-28b61927e36b", 00:21:32.316 "is_configured": true, 00:21:32.316 "data_offset": 2048, 00:21:32.316 "data_size": 63488 00:21:32.316 } 00:21:32.316 ] 00:21:32.316 }' 00:21:32.316 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:32.316 10:29:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:32.883 10:29:09 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:21:32.883 10:29:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:32.883 [2024-07-15 10:29:09.929269] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x243efc0 00:21:33.819 10:29:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:34.076 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:34.077 10:29:11 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.077 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:34.346 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:34.346 "name": "raid_bdev1", 00:21:34.346 "uuid": "c92a1fac-4c32-4412-9185-275ca77dbd2c", 00:21:34.346 "strip_size_kb": 64, 00:21:34.346 "state": "online", 00:21:34.346 "raid_level": "concat", 00:21:34.346 "superblock": true, 00:21:34.346 "num_base_bdevs": 4, 00:21:34.346 "num_base_bdevs_discovered": 4, 00:21:34.346 "num_base_bdevs_operational": 4, 00:21:34.346 "base_bdevs_list": [ 00:21:34.346 { 00:21:34.346 "name": "BaseBdev1", 00:21:34.346 "uuid": "613977ee-e36d-511a-929d-1b3b5afff577", 00:21:34.346 "is_configured": true, 00:21:34.346 "data_offset": 2048, 00:21:34.346 "data_size": 63488 00:21:34.347 }, 00:21:34.347 { 00:21:34.347 "name": "BaseBdev2", 00:21:34.347 "uuid": "5d53b2af-6772-52dd-97d2-abf2dc5411c3", 00:21:34.347 "is_configured": true, 00:21:34.347 "data_offset": 2048, 00:21:34.347 "data_size": 63488 00:21:34.347 }, 00:21:34.347 { 00:21:34.347 "name": "BaseBdev3", 00:21:34.347 "uuid": "61c4fc73-a0c0-5790-bca6-80fce2c05f98", 00:21:34.347 "is_configured": true, 00:21:34.347 "data_offset": 2048, 00:21:34.347 "data_size": 63488 00:21:34.347 }, 00:21:34.347 { 00:21:34.347 "name": "BaseBdev4", 00:21:34.347 "uuid": "bef6d72e-878b-5183-a149-28b61927e36b", 00:21:34.347 "is_configured": true, 00:21:34.347 "data_offset": 2048, 00:21:34.347 "data_size": 63488 00:21:34.347 } 00:21:34.347 ] 00:21:34.347 }' 00:21:34.347 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:34.347 10:29:11 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:34.927 10:29:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:35.186 [2024-07-15 10:29:12.154825] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:35.186 [2024-07-15 10:29:12.154859] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:35.186 [2024-07-15 10:29:12.158029] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:35.186 [2024-07-15 10:29:12.158067] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:35.186 [2024-07-15 10:29:12.158108] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:35.186 [2024-07-15 10:29:12.158119] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x244cc20 name raid_bdev1, state offline 00:21:35.186 0 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 563203 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 563203 ']' 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 563203 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 563203 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 563203' 00:21:35.186 killing process with pid 563203 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 563203 00:21:35.186 [2024-07-15 10:29:12.227191] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:35.186 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 563203 00:21:35.186 [2024-07-15 10:29:12.257905] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.MwTzOHz1nI 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:21:35.446 00:21:35.446 real 0m7.313s 00:21:35.446 user 0m11.701s 00:21:35.446 sys 0m1.222s 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:35.446 10:29:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:35.446 ************************************ 00:21:35.446 END TEST raid_write_error_test 00:21:35.446 ************************************ 00:21:35.446 10:29:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:35.446 10:29:12 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:21:35.446 10:29:12 
bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:21:35.446 10:29:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:35.446 10:29:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:35.446 10:29:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:35.446 ************************************ 00:21:35.446 START TEST raid_state_function_test 00:21:35.446 ************************************ 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- 
# echo BaseBdev3 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=564337 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 564337' 00:21:35.446 Process raid pid: 564337 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 564337 /var/tmp/spdk-raid.sock 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 564337 ']' 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:35.446 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:35.446 10:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:35.705 [2024-07-15 10:29:12.645640] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:21:35.705 [2024-07-15 10:29:12.645714] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:35.705 [2024-07-15 10:29:12.775001] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:35.705 [2024-07-15 10:29:12.880648] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:35.963 [2024-07-15 10:29:12.943946] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:35.963 [2024-07-15 10:29:12.943981] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:36.536 10:29:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:36.536 10:29:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:21:36.536 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:36.793 [2024-07-15 10:29:13.799517] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:36.793 [2024-07-15 10:29:13.799559] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:36.793 [2024-07-15 10:29:13.799570] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:36.793 [2024-07-15 10:29:13.799582] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:36.793 [2024-07-15 10:29:13.799591] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:36.793 [2024-07-15 10:29:13.799602] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:36.793 
[2024-07-15 10:29:13.799611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:36.793 [2024-07-15 10:29:13.799622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.793 10:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:37.050 10:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:37.050 "name": "Existed_Raid", 00:21:37.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.050 "strip_size_kb": 0, 00:21:37.050 "state": 
"configuring", 00:21:37.050 "raid_level": "raid1", 00:21:37.050 "superblock": false, 00:21:37.050 "num_base_bdevs": 4, 00:21:37.050 "num_base_bdevs_discovered": 0, 00:21:37.050 "num_base_bdevs_operational": 4, 00:21:37.050 "base_bdevs_list": [ 00:21:37.050 { 00:21:37.050 "name": "BaseBdev1", 00:21:37.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.050 "is_configured": false, 00:21:37.050 "data_offset": 0, 00:21:37.050 "data_size": 0 00:21:37.050 }, 00:21:37.050 { 00:21:37.050 "name": "BaseBdev2", 00:21:37.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.050 "is_configured": false, 00:21:37.050 "data_offset": 0, 00:21:37.050 "data_size": 0 00:21:37.050 }, 00:21:37.050 { 00:21:37.050 "name": "BaseBdev3", 00:21:37.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.050 "is_configured": false, 00:21:37.050 "data_offset": 0, 00:21:37.050 "data_size": 0 00:21:37.050 }, 00:21:37.050 { 00:21:37.050 "name": "BaseBdev4", 00:21:37.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.050 "is_configured": false, 00:21:37.050 "data_offset": 0, 00:21:37.050 "data_size": 0 00:21:37.050 } 00:21:37.050 ] 00:21:37.050 }' 00:21:37.050 10:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:37.050 10:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:37.614 10:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:37.872 [2024-07-15 10:29:14.894283] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:37.872 [2024-07-15 10:29:14.894311] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e9aa0 name Existed_Raid, state configuring 00:21:37.872 10:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:38.130 [2024-07-15 10:29:15.138947] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:38.130 [2024-07-15 10:29:15.138974] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:38.130 [2024-07-15 10:29:15.138984] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:38.130 [2024-07-15 10:29:15.138995] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:38.130 [2024-07-15 10:29:15.139004] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:38.130 [2024-07-15 10:29:15.139015] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:38.130 [2024-07-15 10:29:15.139024] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:38.130 [2024-07-15 10:29:15.139036] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:38.130 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:38.387 [2024-07-15 10:29:15.393512] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:38.387 BaseBdev1 00:21:38.387 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:38.387 10:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:38.387 10:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:38.387 10:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:38.387 10:29:15 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:38.387 10:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:38.387 10:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:38.644 10:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:38.902 [ 00:21:38.902 { 00:21:38.902 "name": "BaseBdev1", 00:21:38.902 "aliases": [ 00:21:38.902 "89f34b34-0ae6-46b2-b72a-b45e5efc294d" 00:21:38.902 ], 00:21:38.902 "product_name": "Malloc disk", 00:21:38.902 "block_size": 512, 00:21:38.902 "num_blocks": 65536, 00:21:38.902 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:38.902 "assigned_rate_limits": { 00:21:38.902 "rw_ios_per_sec": 0, 00:21:38.902 "rw_mbytes_per_sec": 0, 00:21:38.902 "r_mbytes_per_sec": 0, 00:21:38.902 "w_mbytes_per_sec": 0 00:21:38.902 }, 00:21:38.902 "claimed": true, 00:21:38.902 "claim_type": "exclusive_write", 00:21:38.902 "zoned": false, 00:21:38.902 "supported_io_types": { 00:21:38.902 "read": true, 00:21:38.902 "write": true, 00:21:38.902 "unmap": true, 00:21:38.902 "flush": true, 00:21:38.902 "reset": true, 00:21:38.902 "nvme_admin": false, 00:21:38.902 "nvme_io": false, 00:21:38.902 "nvme_io_md": false, 00:21:38.902 "write_zeroes": true, 00:21:38.902 "zcopy": true, 00:21:38.902 "get_zone_info": false, 00:21:38.902 "zone_management": false, 00:21:38.902 "zone_append": false, 00:21:38.902 "compare": false, 00:21:38.902 "compare_and_write": false, 00:21:38.902 "abort": true, 00:21:38.902 "seek_hole": false, 00:21:38.902 "seek_data": false, 00:21:38.902 "copy": true, 00:21:38.902 "nvme_iov_md": false 00:21:38.902 }, 00:21:38.902 "memory_domains": [ 00:21:38.902 { 
00:21:38.902 "dma_device_id": "system", 00:21:38.902 "dma_device_type": 1 00:21:38.902 }, 00:21:38.902 { 00:21:38.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.902 "dma_device_type": 2 00:21:38.902 } 00:21:38.902 ], 00:21:38.902 "driver_specific": {} 00:21:38.902 } 00:21:38.902 ] 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.902 10:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:39.160 10:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:21:39.160 "name": "Existed_Raid", 00:21:39.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.160 "strip_size_kb": 0, 00:21:39.160 "state": "configuring", 00:21:39.160 "raid_level": "raid1", 00:21:39.160 "superblock": false, 00:21:39.160 "num_base_bdevs": 4, 00:21:39.160 "num_base_bdevs_discovered": 1, 00:21:39.160 "num_base_bdevs_operational": 4, 00:21:39.160 "base_bdevs_list": [ 00:21:39.160 { 00:21:39.160 "name": "BaseBdev1", 00:21:39.160 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:39.160 "is_configured": true, 00:21:39.160 "data_offset": 0, 00:21:39.160 "data_size": 65536 00:21:39.160 }, 00:21:39.160 { 00:21:39.160 "name": "BaseBdev2", 00:21:39.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.160 "is_configured": false, 00:21:39.160 "data_offset": 0, 00:21:39.160 "data_size": 0 00:21:39.160 }, 00:21:39.160 { 00:21:39.160 "name": "BaseBdev3", 00:21:39.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.160 "is_configured": false, 00:21:39.160 "data_offset": 0, 00:21:39.160 "data_size": 0 00:21:39.160 }, 00:21:39.160 { 00:21:39.160 "name": "BaseBdev4", 00:21:39.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.160 "is_configured": false, 00:21:39.160 "data_offset": 0, 00:21:39.160 "data_size": 0 00:21:39.160 } 00:21:39.160 ] 00:21:39.160 }' 00:21:39.160 10:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.160 10:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:39.726 10:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:39.984 [2024-07-15 10:29:16.969675] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:39.984 [2024-07-15 10:29:16.969715] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e9310 name Existed_Raid, state configuring 
00:21:39.984 10:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:40.243 [2024-07-15 10:29:17.214354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:40.243 [2024-07-15 10:29:17.215825] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:40.243 [2024-07-15 10:29:17.215858] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:40.243 [2024-07-15 10:29:17.215869] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:40.243 [2024-07-15 10:29:17.215881] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:40.243 [2024-07-15 10:29:17.215890] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:40.243 [2024-07-15 10:29:17.215901] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.243 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:40.501 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.502 "name": "Existed_Raid", 00:21:40.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.502 "strip_size_kb": 0, 00:21:40.502 "state": "configuring", 00:21:40.502 "raid_level": "raid1", 00:21:40.502 "superblock": false, 00:21:40.502 "num_base_bdevs": 4, 00:21:40.502 "num_base_bdevs_discovered": 1, 00:21:40.502 "num_base_bdevs_operational": 4, 00:21:40.502 "base_bdevs_list": [ 00:21:40.502 { 00:21:40.502 "name": "BaseBdev1", 00:21:40.502 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:40.502 "is_configured": true, 00:21:40.502 "data_offset": 0, 00:21:40.502 "data_size": 65536 00:21:40.502 }, 00:21:40.502 { 00:21:40.502 "name": "BaseBdev2", 00:21:40.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.502 "is_configured": false, 00:21:40.502 "data_offset": 0, 00:21:40.502 "data_size": 0 00:21:40.502 }, 00:21:40.502 { 00:21:40.502 "name": "BaseBdev3", 00:21:40.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.502 "is_configured": false, 00:21:40.502 
"data_offset": 0, 00:21:40.502 "data_size": 0 00:21:40.502 }, 00:21:40.502 { 00:21:40.502 "name": "BaseBdev4", 00:21:40.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.502 "is_configured": false, 00:21:40.502 "data_offset": 0, 00:21:40.502 "data_size": 0 00:21:40.502 } 00:21:40.502 ] 00:21:40.502 }' 00:21:40.502 10:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.502 10:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:41.066 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:41.322 [2024-07-15 10:29:18.332740] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:41.322 BaseBdev2 00:21:41.322 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:41.322 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:41.322 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:41.322 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:41.322 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:41.322 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:41.323 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:41.579 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:41.579 [ 
00:21:41.579 { 00:21:41.579 "name": "BaseBdev2", 00:21:41.579 "aliases": [ 00:21:41.579 "18614096-7d73-4ace-afe8-deb008213492" 00:21:41.579 ], 00:21:41.579 "product_name": "Malloc disk", 00:21:41.579 "block_size": 512, 00:21:41.579 "num_blocks": 65536, 00:21:41.579 "uuid": "18614096-7d73-4ace-afe8-deb008213492", 00:21:41.579 "assigned_rate_limits": { 00:21:41.579 "rw_ios_per_sec": 0, 00:21:41.579 "rw_mbytes_per_sec": 0, 00:21:41.579 "r_mbytes_per_sec": 0, 00:21:41.579 "w_mbytes_per_sec": 0 00:21:41.579 }, 00:21:41.579 "claimed": true, 00:21:41.579 "claim_type": "exclusive_write", 00:21:41.579 "zoned": false, 00:21:41.579 "supported_io_types": { 00:21:41.579 "read": true, 00:21:41.579 "write": true, 00:21:41.579 "unmap": true, 00:21:41.579 "flush": true, 00:21:41.579 "reset": true, 00:21:41.579 "nvme_admin": false, 00:21:41.579 "nvme_io": false, 00:21:41.579 "nvme_io_md": false, 00:21:41.579 "write_zeroes": true, 00:21:41.579 "zcopy": true, 00:21:41.579 "get_zone_info": false, 00:21:41.579 "zone_management": false, 00:21:41.579 "zone_append": false, 00:21:41.579 "compare": false, 00:21:41.579 "compare_and_write": false, 00:21:41.579 "abort": true, 00:21:41.579 "seek_hole": false, 00:21:41.579 "seek_data": false, 00:21:41.579 "copy": true, 00:21:41.579 "nvme_iov_md": false 00:21:41.579 }, 00:21:41.579 "memory_domains": [ 00:21:41.579 { 00:21:41.579 "dma_device_id": "system", 00:21:41.579 "dma_device_type": 1 00:21:41.579 }, 00:21:41.579 { 00:21:41.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:41.579 "dma_device_type": 2 00:21:41.579 } 00:21:41.579 ], 00:21:41.579 "driver_specific": {} 00:21:41.579 } 00:21:41.579 ] 00:21:41.579 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:41.579 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:41.579 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:41.579 10:29:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:41.580 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.837 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.837 "name": "Existed_Raid", 00:21:41.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.837 "strip_size_kb": 0, 00:21:41.837 "state": "configuring", 00:21:41.837 "raid_level": "raid1", 00:21:41.837 "superblock": false, 00:21:41.837 "num_base_bdevs": 4, 00:21:41.837 "num_base_bdevs_discovered": 2, 00:21:41.837 "num_base_bdevs_operational": 4, 00:21:41.837 "base_bdevs_list": [ 00:21:41.837 { 00:21:41.837 
"name": "BaseBdev1", 00:21:41.837 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:41.837 "is_configured": true, 00:21:41.837 "data_offset": 0, 00:21:41.837 "data_size": 65536 00:21:41.837 }, 00:21:41.837 { 00:21:41.837 "name": "BaseBdev2", 00:21:41.837 "uuid": "18614096-7d73-4ace-afe8-deb008213492", 00:21:41.837 "is_configured": true, 00:21:41.837 "data_offset": 0, 00:21:41.837 "data_size": 65536 00:21:41.837 }, 00:21:41.837 { 00:21:41.837 "name": "BaseBdev3", 00:21:41.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.837 "is_configured": false, 00:21:41.837 "data_offset": 0, 00:21:41.837 "data_size": 0 00:21:41.837 }, 00:21:41.837 { 00:21:41.837 "name": "BaseBdev4", 00:21:41.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:41.837 "is_configured": false, 00:21:41.837 "data_offset": 0, 00:21:41.837 "data_size": 0 00:21:41.837 } 00:21:41.837 ] 00:21:41.837 }' 00:21:41.837 10:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.837 10:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:42.403 10:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:42.661 [2024-07-15 10:29:19.720993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:42.661 BaseBdev3 00:21:42.661 10:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:42.661 10:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:42.661 10:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:42.661 10:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:42.661 10:29:19 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:42.661 10:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:42.661 10:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:42.920 10:29:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:42.920 [ 00:21:42.920 { 00:21:42.920 "name": "BaseBdev3", 00:21:42.920 "aliases": [ 00:21:42.920 "b8d4c0c1-964c-4a31-945e-b1f2510e989f" 00:21:42.920 ], 00:21:42.920 "product_name": "Malloc disk", 00:21:42.920 "block_size": 512, 00:21:42.920 "num_blocks": 65536, 00:21:42.920 "uuid": "b8d4c0c1-964c-4a31-945e-b1f2510e989f", 00:21:42.920 "assigned_rate_limits": { 00:21:42.920 "rw_ios_per_sec": 0, 00:21:42.920 "rw_mbytes_per_sec": 0, 00:21:42.920 "r_mbytes_per_sec": 0, 00:21:42.920 "w_mbytes_per_sec": 0 00:21:42.920 }, 00:21:42.920 "claimed": true, 00:21:42.920 "claim_type": "exclusive_write", 00:21:42.920 "zoned": false, 00:21:42.920 "supported_io_types": { 00:21:42.920 "read": true, 00:21:42.920 "write": true, 00:21:42.920 "unmap": true, 00:21:42.920 "flush": true, 00:21:42.920 "reset": true, 00:21:42.920 "nvme_admin": false, 00:21:42.920 "nvme_io": false, 00:21:42.920 "nvme_io_md": false, 00:21:42.920 "write_zeroes": true, 00:21:42.920 "zcopy": true, 00:21:42.920 "get_zone_info": false, 00:21:42.920 "zone_management": false, 00:21:42.920 "zone_append": false, 00:21:42.920 "compare": false, 00:21:42.920 "compare_and_write": false, 00:21:42.920 "abort": true, 00:21:42.920 "seek_hole": false, 00:21:42.920 "seek_data": false, 00:21:42.920 "copy": true, 00:21:42.920 "nvme_iov_md": false 00:21:42.920 }, 00:21:42.920 "memory_domains": [ 00:21:42.920 { 00:21:42.920 "dma_device_id": "system", 
00:21:42.920 "dma_device_type": 1 00:21:42.920 }, 00:21:42.920 { 00:21:42.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.920 "dma_device_type": 2 00:21:42.920 } 00:21:42.920 ], 00:21:42.920 "driver_specific": {} 00:21:42.920 } 00:21:42.920 ] 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.920 10:29:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:43.177 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.177 "name": "Existed_Raid", 00:21:43.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.177 "strip_size_kb": 0, 00:21:43.177 "state": "configuring", 00:21:43.177 "raid_level": "raid1", 00:21:43.177 "superblock": false, 00:21:43.177 "num_base_bdevs": 4, 00:21:43.177 "num_base_bdevs_discovered": 3, 00:21:43.177 "num_base_bdevs_operational": 4, 00:21:43.177 "base_bdevs_list": [ 00:21:43.177 { 00:21:43.177 "name": "BaseBdev1", 00:21:43.177 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:43.177 "is_configured": true, 00:21:43.177 "data_offset": 0, 00:21:43.177 "data_size": 65536 00:21:43.177 }, 00:21:43.177 { 00:21:43.177 "name": "BaseBdev2", 00:21:43.177 "uuid": "18614096-7d73-4ace-afe8-deb008213492", 00:21:43.177 "is_configured": true, 00:21:43.177 "data_offset": 0, 00:21:43.177 "data_size": 65536 00:21:43.177 }, 00:21:43.177 { 00:21:43.177 "name": "BaseBdev3", 00:21:43.177 "uuid": "b8d4c0c1-964c-4a31-945e-b1f2510e989f", 00:21:43.177 "is_configured": true, 00:21:43.177 "data_offset": 0, 00:21:43.177 "data_size": 65536 00:21:43.177 }, 00:21:43.177 { 00:21:43.177 "name": "BaseBdev4", 00:21:43.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:43.177 "is_configured": false, 00:21:43.177 "data_offset": 0, 00:21:43.177 "data_size": 0 00:21:43.177 } 00:21:43.177 ] 00:21:43.178 }' 00:21:43.178 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.178 10:29:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:43.743 10:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:44.001 [2024-07-15 10:29:21.096130] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:44.001 [2024-07-15 10:29:21.096171] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16ea350 00:21:44.001 [2024-07-15 10:29:21.096179] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:44.001 [2024-07-15 10:29:21.096432] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16ea020 00:21:44.001 [2024-07-15 10:29:21.096559] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16ea350 00:21:44.001 [2024-07-15 10:29:21.096569] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16ea350 00:21:44.001 [2024-07-15 10:29:21.096728] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:44.001 BaseBdev4 00:21:44.001 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:44.001 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:44.001 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:44.001 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:44.001 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:44.001 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:44.001 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:44.258 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:44.258 [ 00:21:44.258 { 
00:21:44.258 "name": "BaseBdev4", 00:21:44.258 "aliases": [ 00:21:44.258 "98e8323e-4248-4f6f-ba7a-350aeef11be0" 00:21:44.258 ], 00:21:44.258 "product_name": "Malloc disk", 00:21:44.258 "block_size": 512, 00:21:44.259 "num_blocks": 65536, 00:21:44.259 "uuid": "98e8323e-4248-4f6f-ba7a-350aeef11be0", 00:21:44.259 "assigned_rate_limits": { 00:21:44.259 "rw_ios_per_sec": 0, 00:21:44.259 "rw_mbytes_per_sec": 0, 00:21:44.259 "r_mbytes_per_sec": 0, 00:21:44.259 "w_mbytes_per_sec": 0 00:21:44.259 }, 00:21:44.259 "claimed": true, 00:21:44.259 "claim_type": "exclusive_write", 00:21:44.259 "zoned": false, 00:21:44.259 "supported_io_types": { 00:21:44.259 "read": true, 00:21:44.259 "write": true, 00:21:44.259 "unmap": true, 00:21:44.259 "flush": true, 00:21:44.259 "reset": true, 00:21:44.259 "nvme_admin": false, 00:21:44.259 "nvme_io": false, 00:21:44.259 "nvme_io_md": false, 00:21:44.259 "write_zeroes": true, 00:21:44.259 "zcopy": true, 00:21:44.259 "get_zone_info": false, 00:21:44.259 "zone_management": false, 00:21:44.259 "zone_append": false, 00:21:44.259 "compare": false, 00:21:44.259 "compare_and_write": false, 00:21:44.259 "abort": true, 00:21:44.259 "seek_hole": false, 00:21:44.259 "seek_data": false, 00:21:44.259 "copy": true, 00:21:44.259 "nvme_iov_md": false 00:21:44.259 }, 00:21:44.259 "memory_domains": [ 00:21:44.259 { 00:21:44.259 "dma_device_id": "system", 00:21:44.259 "dma_device_type": 1 00:21:44.259 }, 00:21:44.259 { 00:21:44.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.259 "dma_device_type": 2 00:21:44.259 } 00:21:44.259 ], 00:21:44.259 "driver_specific": {} 00:21:44.259 } 00:21:44.259 ] 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:44.517 10:29:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:44.517 "name": "Existed_Raid", 00:21:44.517 "uuid": "e0a2cd22-2501-4f92-bd60-093ac0b52c98", 00:21:44.517 "strip_size_kb": 0, 00:21:44.517 "state": "online", 00:21:44.517 "raid_level": "raid1", 00:21:44.517 "superblock": false, 00:21:44.517 "num_base_bdevs": 4, 00:21:44.517 "num_base_bdevs_discovered": 4, 00:21:44.517 "num_base_bdevs_operational": 4, 00:21:44.517 "base_bdevs_list": [ 00:21:44.517 { 00:21:44.517 "name": 
"BaseBdev1", 00:21:44.517 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:44.517 "is_configured": true, 00:21:44.517 "data_offset": 0, 00:21:44.517 "data_size": 65536 00:21:44.517 }, 00:21:44.517 { 00:21:44.517 "name": "BaseBdev2", 00:21:44.517 "uuid": "18614096-7d73-4ace-afe8-deb008213492", 00:21:44.517 "is_configured": true, 00:21:44.517 "data_offset": 0, 00:21:44.517 "data_size": 65536 00:21:44.517 }, 00:21:44.517 { 00:21:44.517 "name": "BaseBdev3", 00:21:44.517 "uuid": "b8d4c0c1-964c-4a31-945e-b1f2510e989f", 00:21:44.517 "is_configured": true, 00:21:44.517 "data_offset": 0, 00:21:44.517 "data_size": 65536 00:21:44.517 }, 00:21:44.517 { 00:21:44.517 "name": "BaseBdev4", 00:21:44.517 "uuid": "98e8323e-4248-4f6f-ba7a-350aeef11be0", 00:21:44.517 "is_configured": true, 00:21:44.517 "data_offset": 0, 00:21:44.517 "data_size": 65536 00:21:44.517 } 00:21:44.517 ] 00:21:44.517 }' 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:44.517 10:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:45.083 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:45.083 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:45.083 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:45.083 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:45.083 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:45.083 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:45.083 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:45.083 10:29:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:45.341 [2024-07-15 10:29:22.440027] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:45.341 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:45.341 "name": "Existed_Raid", 00:21:45.341 "aliases": [ 00:21:45.341 "e0a2cd22-2501-4f92-bd60-093ac0b52c98" 00:21:45.341 ], 00:21:45.341 "product_name": "Raid Volume", 00:21:45.341 "block_size": 512, 00:21:45.341 "num_blocks": 65536, 00:21:45.341 "uuid": "e0a2cd22-2501-4f92-bd60-093ac0b52c98", 00:21:45.341 "assigned_rate_limits": { 00:21:45.341 "rw_ios_per_sec": 0, 00:21:45.341 "rw_mbytes_per_sec": 0, 00:21:45.341 "r_mbytes_per_sec": 0, 00:21:45.341 "w_mbytes_per_sec": 0 00:21:45.341 }, 00:21:45.341 "claimed": false, 00:21:45.341 "zoned": false, 00:21:45.341 "supported_io_types": { 00:21:45.341 "read": true, 00:21:45.341 "write": true, 00:21:45.341 "unmap": false, 00:21:45.341 "flush": false, 00:21:45.341 "reset": true, 00:21:45.341 "nvme_admin": false, 00:21:45.341 "nvme_io": false, 00:21:45.341 "nvme_io_md": false, 00:21:45.341 "write_zeroes": true, 00:21:45.341 "zcopy": false, 00:21:45.341 "get_zone_info": false, 00:21:45.341 "zone_management": false, 00:21:45.341 "zone_append": false, 00:21:45.341 "compare": false, 00:21:45.341 "compare_and_write": false, 00:21:45.341 "abort": false, 00:21:45.341 "seek_hole": false, 00:21:45.341 "seek_data": false, 00:21:45.341 "copy": false, 00:21:45.341 "nvme_iov_md": false 00:21:45.341 }, 00:21:45.341 "memory_domains": [ 00:21:45.341 { 00:21:45.341 "dma_device_id": "system", 00:21:45.341 "dma_device_type": 1 00:21:45.341 }, 00:21:45.341 { 00:21:45.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.341 "dma_device_type": 2 00:21:45.341 }, 00:21:45.341 { 00:21:45.342 "dma_device_id": "system", 00:21:45.342 "dma_device_type": 1 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:21:45.342 "dma_device_type": 2 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "dma_device_id": "system", 00:21:45.342 "dma_device_type": 1 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.342 "dma_device_type": 2 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "dma_device_id": "system", 00:21:45.342 "dma_device_type": 1 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.342 "dma_device_type": 2 00:21:45.342 } 00:21:45.342 ], 00:21:45.342 "driver_specific": { 00:21:45.342 "raid": { 00:21:45.342 "uuid": "e0a2cd22-2501-4f92-bd60-093ac0b52c98", 00:21:45.342 "strip_size_kb": 0, 00:21:45.342 "state": "online", 00:21:45.342 "raid_level": "raid1", 00:21:45.342 "superblock": false, 00:21:45.342 "num_base_bdevs": 4, 00:21:45.342 "num_base_bdevs_discovered": 4, 00:21:45.342 "num_base_bdevs_operational": 4, 00:21:45.342 "base_bdevs_list": [ 00:21:45.342 { 00:21:45.342 "name": "BaseBdev1", 00:21:45.342 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:45.342 "is_configured": true, 00:21:45.342 "data_offset": 0, 00:21:45.342 "data_size": 65536 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "name": "BaseBdev2", 00:21:45.342 "uuid": "18614096-7d73-4ace-afe8-deb008213492", 00:21:45.342 "is_configured": true, 00:21:45.342 "data_offset": 0, 00:21:45.342 "data_size": 65536 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "name": "BaseBdev3", 00:21:45.342 "uuid": "b8d4c0c1-964c-4a31-945e-b1f2510e989f", 00:21:45.342 "is_configured": true, 00:21:45.342 "data_offset": 0, 00:21:45.342 "data_size": 65536 00:21:45.342 }, 00:21:45.342 { 00:21:45.342 "name": "BaseBdev4", 00:21:45.342 "uuid": "98e8323e-4248-4f6f-ba7a-350aeef11be0", 00:21:45.342 "is_configured": true, 00:21:45.342 "data_offset": 0, 00:21:45.342 "data_size": 65536 00:21:45.342 } 00:21:45.342 ] 00:21:45.342 } 00:21:45.342 } 00:21:45.342 }' 00:21:45.342 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:45.342 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:45.342 BaseBdev2 00:21:45.342 BaseBdev3 00:21:45.342 BaseBdev4' 00:21:45.342 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:45.342 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:45.342 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:45.600 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:45.600 "name": "BaseBdev1", 00:21:45.600 "aliases": [ 00:21:45.600 "89f34b34-0ae6-46b2-b72a-b45e5efc294d" 00:21:45.600 ], 00:21:45.600 "product_name": "Malloc disk", 00:21:45.600 "block_size": 512, 00:21:45.600 "num_blocks": 65536, 00:21:45.600 "uuid": "89f34b34-0ae6-46b2-b72a-b45e5efc294d", 00:21:45.600 "assigned_rate_limits": { 00:21:45.600 "rw_ios_per_sec": 0, 00:21:45.600 "rw_mbytes_per_sec": 0, 00:21:45.600 "r_mbytes_per_sec": 0, 00:21:45.600 "w_mbytes_per_sec": 0 00:21:45.600 }, 00:21:45.600 "claimed": true, 00:21:45.600 "claim_type": "exclusive_write", 00:21:45.600 "zoned": false, 00:21:45.600 "supported_io_types": { 00:21:45.600 "read": true, 00:21:45.600 "write": true, 00:21:45.600 "unmap": true, 00:21:45.600 "flush": true, 00:21:45.600 "reset": true, 00:21:45.600 "nvme_admin": false, 00:21:45.600 "nvme_io": false, 00:21:45.600 "nvme_io_md": false, 00:21:45.600 "write_zeroes": true, 00:21:45.600 "zcopy": true, 00:21:45.600 "get_zone_info": false, 00:21:45.600 "zone_management": false, 00:21:45.600 "zone_append": false, 00:21:45.600 "compare": false, 00:21:45.600 "compare_and_write": false, 00:21:45.600 "abort": true, 00:21:45.600 "seek_hole": false, 00:21:45.600 "seek_data": 
false, 00:21:45.600 "copy": true, 00:21:45.600 "nvme_iov_md": false 00:21:45.600 }, 00:21:45.600 "memory_domains": [ 00:21:45.600 { 00:21:45.600 "dma_device_id": "system", 00:21:45.600 "dma_device_type": 1 00:21:45.600 }, 00:21:45.600 { 00:21:45.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.600 "dma_device_type": 2 00:21:45.600 } 00:21:45.600 ], 00:21:45.600 "driver_specific": {} 00:21:45.600 }' 00:21:45.601 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:45.858 10:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.858 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.858 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.858 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:46.115 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:46.115 10:29:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.115 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:46.115 "name": "BaseBdev2", 00:21:46.115 "aliases": [ 00:21:46.115 "18614096-7d73-4ace-afe8-deb008213492" 00:21:46.115 ], 00:21:46.115 "product_name": "Malloc disk", 00:21:46.115 "block_size": 512, 00:21:46.115 "num_blocks": 65536, 00:21:46.115 "uuid": "18614096-7d73-4ace-afe8-deb008213492", 00:21:46.115 "assigned_rate_limits": { 00:21:46.115 "rw_ios_per_sec": 0, 00:21:46.115 "rw_mbytes_per_sec": 0, 00:21:46.115 "r_mbytes_per_sec": 0, 00:21:46.115 "w_mbytes_per_sec": 0 00:21:46.115 }, 00:21:46.115 "claimed": true, 00:21:46.115 "claim_type": "exclusive_write", 00:21:46.115 "zoned": false, 00:21:46.115 "supported_io_types": { 00:21:46.115 "read": true, 00:21:46.115 "write": true, 00:21:46.115 "unmap": true, 00:21:46.115 "flush": true, 00:21:46.115 "reset": true, 00:21:46.115 "nvme_admin": false, 00:21:46.115 "nvme_io": false, 00:21:46.115 "nvme_io_md": false, 00:21:46.115 "write_zeroes": true, 00:21:46.115 "zcopy": true, 00:21:46.115 "get_zone_info": false, 00:21:46.115 "zone_management": false, 00:21:46.115 "zone_append": false, 00:21:46.115 "compare": false, 00:21:46.115 "compare_and_write": false, 00:21:46.115 "abort": true, 00:21:46.115 "seek_hole": false, 00:21:46.115 "seek_data": false, 00:21:46.115 "copy": true, 00:21:46.115 "nvme_iov_md": false 00:21:46.115 }, 00:21:46.115 "memory_domains": [ 00:21:46.115 { 00:21:46.115 "dma_device_id": "system", 00:21:46.115 "dma_device_type": 1 00:21:46.115 }, 00:21:46.115 { 00:21:46.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.115 "dma_device_type": 2 00:21:46.115 } 00:21:46.115 ], 00:21:46.115 "driver_specific": {} 00:21:46.115 }' 00:21:46.115 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.371 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.628 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.628 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.628 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:46.628 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:46.628 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.885 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:46.885 "name": "BaseBdev3", 00:21:46.885 "aliases": [ 00:21:46.885 "b8d4c0c1-964c-4a31-945e-b1f2510e989f" 00:21:46.885 ], 00:21:46.885 "product_name": "Malloc disk", 00:21:46.885 "block_size": 512, 00:21:46.885 "num_blocks": 65536, 00:21:46.885 "uuid": "b8d4c0c1-964c-4a31-945e-b1f2510e989f", 00:21:46.885 "assigned_rate_limits": { 00:21:46.885 "rw_ios_per_sec": 0, 00:21:46.885 "rw_mbytes_per_sec": 0, 00:21:46.885 "r_mbytes_per_sec": 0, 
00:21:46.885 "w_mbytes_per_sec": 0 00:21:46.885 }, 00:21:46.886 "claimed": true, 00:21:46.886 "claim_type": "exclusive_write", 00:21:46.886 "zoned": false, 00:21:46.886 "supported_io_types": { 00:21:46.886 "read": true, 00:21:46.886 "write": true, 00:21:46.886 "unmap": true, 00:21:46.886 "flush": true, 00:21:46.886 "reset": true, 00:21:46.886 "nvme_admin": false, 00:21:46.886 "nvme_io": false, 00:21:46.886 "nvme_io_md": false, 00:21:46.886 "write_zeroes": true, 00:21:46.886 "zcopy": true, 00:21:46.886 "get_zone_info": false, 00:21:46.886 "zone_management": false, 00:21:46.886 "zone_append": false, 00:21:46.886 "compare": false, 00:21:46.886 "compare_and_write": false, 00:21:46.886 "abort": true, 00:21:46.886 "seek_hole": false, 00:21:46.886 "seek_data": false, 00:21:46.886 "copy": true, 00:21:46.886 "nvme_iov_md": false 00:21:46.886 }, 00:21:46.886 "memory_domains": [ 00:21:46.886 { 00:21:46.886 "dma_device_id": "system", 00:21:46.886 "dma_device_type": 1 00:21:46.886 }, 00:21:46.886 { 00:21:46.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.886 "dma_device_type": 2 00:21:46.886 } 00:21:46.886 ], 00:21:46.886 "driver_specific": {} 00:21:46.886 }' 00:21:46.886 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.886 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.886 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:46.886 10:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.886 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.886 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.886 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.886 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:21:47.143 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:47.143 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.143 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.143 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:47.143 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:47.143 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:47.143 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:47.400 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:47.400 "name": "BaseBdev4", 00:21:47.400 "aliases": [ 00:21:47.400 "98e8323e-4248-4f6f-ba7a-350aeef11be0" 00:21:47.400 ], 00:21:47.400 "product_name": "Malloc disk", 00:21:47.400 "block_size": 512, 00:21:47.400 "num_blocks": 65536, 00:21:47.400 "uuid": "98e8323e-4248-4f6f-ba7a-350aeef11be0", 00:21:47.400 "assigned_rate_limits": { 00:21:47.400 "rw_ios_per_sec": 0, 00:21:47.400 "rw_mbytes_per_sec": 0, 00:21:47.400 "r_mbytes_per_sec": 0, 00:21:47.400 "w_mbytes_per_sec": 0 00:21:47.400 }, 00:21:47.400 "claimed": true, 00:21:47.400 "claim_type": "exclusive_write", 00:21:47.400 "zoned": false, 00:21:47.400 "supported_io_types": { 00:21:47.400 "read": true, 00:21:47.400 "write": true, 00:21:47.400 "unmap": true, 00:21:47.400 "flush": true, 00:21:47.400 "reset": true, 00:21:47.400 "nvme_admin": false, 00:21:47.400 "nvme_io": false, 00:21:47.400 "nvme_io_md": false, 00:21:47.400 "write_zeroes": true, 00:21:47.400 "zcopy": true, 00:21:47.400 "get_zone_info": false, 00:21:47.400 "zone_management": false, 00:21:47.400 "zone_append": false, 00:21:47.400 
"compare": false, 00:21:47.400 "compare_and_write": false, 00:21:47.400 "abort": true, 00:21:47.400 "seek_hole": false, 00:21:47.400 "seek_data": false, 00:21:47.400 "copy": true, 00:21:47.400 "nvme_iov_md": false 00:21:47.400 }, 00:21:47.400 "memory_domains": [ 00:21:47.400 { 00:21:47.400 "dma_device_id": "system", 00:21:47.400 "dma_device_type": 1 00:21:47.400 }, 00:21:47.400 { 00:21:47.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.400 "dma_device_type": 2 00:21:47.400 } 00:21:47.400 ], 00:21:47.400 "driver_specific": {} 00:21:47.400 }' 00:21:47.400 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.400 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.400 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:47.400 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.400 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:47.658 10:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:21:47.956 [2024-07-15 10:29:24.994545] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.956 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.956 10:29:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.236 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:48.236 "name": "Existed_Raid", 00:21:48.236 "uuid": "e0a2cd22-2501-4f92-bd60-093ac0b52c98", 00:21:48.236 "strip_size_kb": 0, 00:21:48.236 "state": "online", 00:21:48.236 "raid_level": "raid1", 00:21:48.236 "superblock": false, 00:21:48.236 "num_base_bdevs": 4, 00:21:48.236 "num_base_bdevs_discovered": 3, 00:21:48.236 "num_base_bdevs_operational": 3, 00:21:48.236 "base_bdevs_list": [ 00:21:48.236 { 00:21:48.236 "name": null, 00:21:48.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.236 "is_configured": false, 00:21:48.236 "data_offset": 0, 00:21:48.236 "data_size": 65536 00:21:48.236 }, 00:21:48.236 { 00:21:48.236 "name": "BaseBdev2", 00:21:48.236 "uuid": "18614096-7d73-4ace-afe8-deb008213492", 00:21:48.236 "is_configured": true, 00:21:48.236 "data_offset": 0, 00:21:48.236 "data_size": 65536 00:21:48.236 }, 00:21:48.236 { 00:21:48.236 "name": "BaseBdev3", 00:21:48.236 "uuid": "b8d4c0c1-964c-4a31-945e-b1f2510e989f", 00:21:48.236 "is_configured": true, 00:21:48.236 "data_offset": 0, 00:21:48.236 "data_size": 65536 00:21:48.236 }, 00:21:48.236 { 00:21:48.236 "name": "BaseBdev4", 00:21:48.236 "uuid": "98e8323e-4248-4f6f-ba7a-350aeef11be0", 00:21:48.236 "is_configured": true, 00:21:48.236 "data_offset": 0, 00:21:48.236 "data_size": 65536 00:21:48.236 } 00:21:48.236 ] 00:21:48.236 }' 00:21:48.236 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:48.236 10:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.801 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:48.801 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:48.801 10:29:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:48.801 10:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.058 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:49.058 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:49.059 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:49.317 [2024-07-15 10:29:26.331677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:49.317 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:49.317 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:49.317 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.317 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:49.576 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:49.576 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:49.576 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:49.576 [2024-07-15 10:29:26.707226] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:49.576 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:21:49.576 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:49.576 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.576 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:49.834 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:49.834 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:49.834 10:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:50.093 [2024-07-15 10:29:27.146908] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:50.093 [2024-07-15 10:29:27.146996] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:50.093 [2024-07-15 10:29:27.158380] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:50.093 [2024-07-15 10:29:27.158413] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:50.093 [2024-07-15 10:29:27.158424] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ea350 name Existed_Raid, state offline 00:21:50.093 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:50.093 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:50.093 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.093 10:29:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:50.351 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:50.351 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:50.351 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:50.351 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:50.351 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:50.351 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:50.610 BaseBdev2 00:21:50.610 10:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:50.610 10:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:50.610 10:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:50.610 10:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:50.610 10:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:50.610 10:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:50.610 10:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:50.869 10:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:51.128 [ 00:21:51.128 { 00:21:51.128 "name": "BaseBdev2", 00:21:51.128 "aliases": [ 
00:21:51.128 "ff2e39c4-ad9c-458e-b049-0311d85b911c" 00:21:51.128 ], 00:21:51.128 "product_name": "Malloc disk", 00:21:51.128 "block_size": 512, 00:21:51.128 "num_blocks": 65536, 00:21:51.128 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:21:51.128 "assigned_rate_limits": { 00:21:51.128 "rw_ios_per_sec": 0, 00:21:51.128 "rw_mbytes_per_sec": 0, 00:21:51.128 "r_mbytes_per_sec": 0, 00:21:51.128 "w_mbytes_per_sec": 0 00:21:51.128 }, 00:21:51.128 "claimed": false, 00:21:51.128 "zoned": false, 00:21:51.128 "supported_io_types": { 00:21:51.128 "read": true, 00:21:51.128 "write": true, 00:21:51.128 "unmap": true, 00:21:51.128 "flush": true, 00:21:51.128 "reset": true, 00:21:51.128 "nvme_admin": false, 00:21:51.128 "nvme_io": false, 00:21:51.128 "nvme_io_md": false, 00:21:51.128 "write_zeroes": true, 00:21:51.128 "zcopy": true, 00:21:51.128 "get_zone_info": false, 00:21:51.128 "zone_management": false, 00:21:51.128 "zone_append": false, 00:21:51.128 "compare": false, 00:21:51.128 "compare_and_write": false, 00:21:51.128 "abort": true, 00:21:51.128 "seek_hole": false, 00:21:51.128 "seek_data": false, 00:21:51.128 "copy": true, 00:21:51.128 "nvme_iov_md": false 00:21:51.128 }, 00:21:51.128 "memory_domains": [ 00:21:51.128 { 00:21:51.128 "dma_device_id": "system", 00:21:51.128 "dma_device_type": 1 00:21:51.128 }, 00:21:51.128 { 00:21:51.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.128 "dma_device_type": 2 00:21:51.128 } 00:21:51.128 ], 00:21:51.128 "driver_specific": {} 00:21:51.128 } 00:21:51.128 ] 00:21:51.128 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:51.128 10:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:51.128 10:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:51.128 10:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:51.387 BaseBdev3 00:21:51.387 10:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:51.387 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:51.387 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:51.387 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:51.387 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:51.387 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:51.387 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:51.645 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:51.904 [ 00:21:51.904 { 00:21:51.904 "name": "BaseBdev3", 00:21:51.904 "aliases": [ 00:21:51.904 "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9" 00:21:51.904 ], 00:21:51.904 "product_name": "Malloc disk", 00:21:51.904 "block_size": 512, 00:21:51.904 "num_blocks": 65536, 00:21:51.904 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:21:51.904 "assigned_rate_limits": { 00:21:51.904 "rw_ios_per_sec": 0, 00:21:51.904 "rw_mbytes_per_sec": 0, 00:21:51.904 "r_mbytes_per_sec": 0, 00:21:51.904 "w_mbytes_per_sec": 0 00:21:51.904 }, 00:21:51.904 "claimed": false, 00:21:51.904 "zoned": false, 00:21:51.904 "supported_io_types": { 00:21:51.904 "read": true, 00:21:51.904 "write": true, 00:21:51.904 "unmap": true, 00:21:51.904 "flush": true, 00:21:51.904 "reset": true, 00:21:51.904 "nvme_admin": false, 00:21:51.904 
"nvme_io": false, 00:21:51.904 "nvme_io_md": false, 00:21:51.904 "write_zeroes": true, 00:21:51.904 "zcopy": true, 00:21:51.904 "get_zone_info": false, 00:21:51.904 "zone_management": false, 00:21:51.904 "zone_append": false, 00:21:51.904 "compare": false, 00:21:51.904 "compare_and_write": false, 00:21:51.904 "abort": true, 00:21:51.904 "seek_hole": false, 00:21:51.904 "seek_data": false, 00:21:51.904 "copy": true, 00:21:51.904 "nvme_iov_md": false 00:21:51.904 }, 00:21:51.904 "memory_domains": [ 00:21:51.904 { 00:21:51.904 "dma_device_id": "system", 00:21:51.904 "dma_device_type": 1 00:21:51.904 }, 00:21:51.904 { 00:21:51.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:51.904 "dma_device_type": 2 00:21:51.904 } 00:21:51.904 ], 00:21:51.904 "driver_specific": {} 00:21:51.904 } 00:21:51.904 ] 00:21:51.904 10:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:51.904 10:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:51.904 10:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:51.904 10:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:52.162 BaseBdev4 00:21:52.162 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:52.162 10:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:52.162 10:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:52.162 10:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:52.162 10:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:52.162 10:29:29 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:52.162 10:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:52.419 10:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:52.677 [ 00:21:52.677 { 00:21:52.677 "name": "BaseBdev4", 00:21:52.677 "aliases": [ 00:21:52.677 "8e356111-bba5-40d1-b8ed-86171b899474" 00:21:52.677 ], 00:21:52.677 "product_name": "Malloc disk", 00:21:52.677 "block_size": 512, 00:21:52.677 "num_blocks": 65536, 00:21:52.677 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:21:52.677 "assigned_rate_limits": { 00:21:52.677 "rw_ios_per_sec": 0, 00:21:52.677 "rw_mbytes_per_sec": 0, 00:21:52.677 "r_mbytes_per_sec": 0, 00:21:52.677 "w_mbytes_per_sec": 0 00:21:52.677 }, 00:21:52.677 "claimed": false, 00:21:52.677 "zoned": false, 00:21:52.677 "supported_io_types": { 00:21:52.677 "read": true, 00:21:52.677 "write": true, 00:21:52.677 "unmap": true, 00:21:52.677 "flush": true, 00:21:52.677 "reset": true, 00:21:52.677 "nvme_admin": false, 00:21:52.677 "nvme_io": false, 00:21:52.677 "nvme_io_md": false, 00:21:52.677 "write_zeroes": true, 00:21:52.677 "zcopy": true, 00:21:52.677 "get_zone_info": false, 00:21:52.677 "zone_management": false, 00:21:52.677 "zone_append": false, 00:21:52.677 "compare": false, 00:21:52.677 "compare_and_write": false, 00:21:52.677 "abort": true, 00:21:52.677 "seek_hole": false, 00:21:52.677 "seek_data": false, 00:21:52.677 "copy": true, 00:21:52.677 "nvme_iov_md": false 00:21:52.677 }, 00:21:52.677 "memory_domains": [ 00:21:52.677 { 00:21:52.677 "dma_device_id": "system", 00:21:52.677 "dma_device_type": 1 00:21:52.677 }, 00:21:52.677 { 00:21:52.677 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:52.677 "dma_device_type": 
2 00:21:52.677 } 00:21:52.677 ], 00:21:52.677 "driver_specific": {} 00:21:52.677 } 00:21:52.677 ] 00:21:52.677 10:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:52.677 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:52.677 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:52.677 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:52.677 [2024-07-15 10:29:29.859687] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:52.677 [2024-07-15 10:29:29.859731] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:52.677 [2024-07-15 10:29:29.859752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:52.677 [2024-07-15 10:29:29.861142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:52.677 [2024-07-15 10:29:29.861183] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.936 10:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.936 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.936 "name": "Existed_Raid", 00:21:52.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.936 "strip_size_kb": 0, 00:21:52.936 "state": "configuring", 00:21:52.937 "raid_level": "raid1", 00:21:52.937 "superblock": false, 00:21:52.937 "num_base_bdevs": 4, 00:21:52.937 "num_base_bdevs_discovered": 3, 00:21:52.937 "num_base_bdevs_operational": 4, 00:21:52.937 "base_bdevs_list": [ 00:21:52.937 { 00:21:52.937 "name": "BaseBdev1", 00:21:52.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.937 "is_configured": false, 00:21:52.937 "data_offset": 0, 00:21:52.937 "data_size": 0 00:21:52.937 }, 00:21:52.937 { 00:21:52.937 "name": "BaseBdev2", 00:21:52.937 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:21:52.937 "is_configured": true, 00:21:52.937 "data_offset": 0, 00:21:52.937 "data_size": 65536 00:21:52.937 }, 00:21:52.937 { 00:21:52.937 "name": "BaseBdev3", 00:21:52.937 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:21:52.937 "is_configured": true, 00:21:52.937 "data_offset": 0, 00:21:52.937 "data_size": 65536 00:21:52.937 }, 00:21:52.937 { 
00:21:52.937 "name": "BaseBdev4", 00:21:52.937 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:21:52.937 "is_configured": true, 00:21:52.937 "data_offset": 0, 00:21:52.937 "data_size": 65536 00:21:52.937 } 00:21:52.937 ] 00:21:52.937 }' 00:21:52.937 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.937 10:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:53.869 [2024-07-15 10:29:30.918476] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.869 10:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:54.127 10:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:54.127 "name": "Existed_Raid", 00:21:54.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.127 "strip_size_kb": 0, 00:21:54.127 "state": "configuring", 00:21:54.127 "raid_level": "raid1", 00:21:54.127 "superblock": false, 00:21:54.127 "num_base_bdevs": 4, 00:21:54.127 "num_base_bdevs_discovered": 2, 00:21:54.127 "num_base_bdevs_operational": 4, 00:21:54.127 "base_bdevs_list": [ 00:21:54.127 { 00:21:54.127 "name": "BaseBdev1", 00:21:54.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:54.127 "is_configured": false, 00:21:54.127 "data_offset": 0, 00:21:54.127 "data_size": 0 00:21:54.127 }, 00:21:54.127 { 00:21:54.127 "name": null, 00:21:54.127 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:21:54.127 "is_configured": false, 00:21:54.127 "data_offset": 0, 00:21:54.127 "data_size": 65536 00:21:54.127 }, 00:21:54.127 { 00:21:54.127 "name": "BaseBdev3", 00:21:54.127 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:21:54.127 "is_configured": true, 00:21:54.127 "data_offset": 0, 00:21:54.127 "data_size": 65536 00:21:54.127 }, 00:21:54.127 { 00:21:54.127 "name": "BaseBdev4", 00:21:54.127 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:21:54.127 "is_configured": true, 00:21:54.127 "data_offset": 0, 00:21:54.127 "data_size": 65536 00:21:54.127 } 00:21:54.127 ] 00:21:54.127 }' 00:21:54.127 10:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:54.127 10:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.691 10:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:21:54.691 10:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.948 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:54.948 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:55.206 [2024-07-15 10:29:32.285482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:55.206 BaseBdev1 00:21:55.206 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:55.206 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:55.206 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:55.206 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:55.206 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:55.206 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:55.206 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:55.464 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:55.720 [ 00:21:55.720 { 00:21:55.720 "name": "BaseBdev1", 00:21:55.720 "aliases": [ 00:21:55.720 "70af587f-0f99-41f5-8d47-52d08d1f270c" 00:21:55.720 ], 00:21:55.720 
"product_name": "Malloc disk", 00:21:55.720 "block_size": 512, 00:21:55.720 "num_blocks": 65536, 00:21:55.720 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:21:55.720 "assigned_rate_limits": { 00:21:55.720 "rw_ios_per_sec": 0, 00:21:55.720 "rw_mbytes_per_sec": 0, 00:21:55.720 "r_mbytes_per_sec": 0, 00:21:55.720 "w_mbytes_per_sec": 0 00:21:55.720 }, 00:21:55.720 "claimed": true, 00:21:55.720 "claim_type": "exclusive_write", 00:21:55.720 "zoned": false, 00:21:55.720 "supported_io_types": { 00:21:55.720 "read": true, 00:21:55.720 "write": true, 00:21:55.720 "unmap": true, 00:21:55.720 "flush": true, 00:21:55.720 "reset": true, 00:21:55.720 "nvme_admin": false, 00:21:55.720 "nvme_io": false, 00:21:55.720 "nvme_io_md": false, 00:21:55.720 "write_zeroes": true, 00:21:55.720 "zcopy": true, 00:21:55.720 "get_zone_info": false, 00:21:55.720 "zone_management": false, 00:21:55.720 "zone_append": false, 00:21:55.720 "compare": false, 00:21:55.720 "compare_and_write": false, 00:21:55.720 "abort": true, 00:21:55.720 "seek_hole": false, 00:21:55.720 "seek_data": false, 00:21:55.720 "copy": true, 00:21:55.720 "nvme_iov_md": false 00:21:55.720 }, 00:21:55.720 "memory_domains": [ 00:21:55.720 { 00:21:55.720 "dma_device_id": "system", 00:21:55.720 "dma_device_type": 1 00:21:55.720 }, 00:21:55.720 { 00:21:55.720 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.720 "dma_device_type": 2 00:21:55.720 } 00:21:55.720 ], 00:21:55.720 "driver_specific": {} 00:21:55.720 } 00:21:55.720 ] 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:55.720 
10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.720 10:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:55.978 10:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.978 "name": "Existed_Raid", 00:21:55.978 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:55.978 "strip_size_kb": 0, 00:21:55.978 "state": "configuring", 00:21:55.978 "raid_level": "raid1", 00:21:55.978 "superblock": false, 00:21:55.978 "num_base_bdevs": 4, 00:21:55.978 "num_base_bdevs_discovered": 3, 00:21:55.978 "num_base_bdevs_operational": 4, 00:21:55.978 "base_bdevs_list": [ 00:21:55.978 { 00:21:55.978 "name": "BaseBdev1", 00:21:55.978 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:21:55.978 "is_configured": true, 00:21:55.978 "data_offset": 0, 00:21:55.978 "data_size": 65536 00:21:55.978 }, 00:21:55.978 { 00:21:55.978 "name": null, 00:21:55.978 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:21:55.978 "is_configured": false, 00:21:55.978 "data_offset": 0, 
00:21:55.978 "data_size": 65536 00:21:55.978 }, 00:21:55.978 { 00:21:55.978 "name": "BaseBdev3", 00:21:55.978 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:21:55.978 "is_configured": true, 00:21:55.978 "data_offset": 0, 00:21:55.978 "data_size": 65536 00:21:55.978 }, 00:21:55.978 { 00:21:55.978 "name": "BaseBdev4", 00:21:55.978 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:21:55.978 "is_configured": true, 00:21:55.978 "data_offset": 0, 00:21:55.978 "data_size": 65536 00:21:55.978 } 00:21:55.978 ] 00:21:55.978 }' 00:21:55.978 10:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.978 10:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:56.541 10:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.541 10:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:56.799 10:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:56.799 10:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:57.057 [2024-07-15 10:29:34.094327] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.057 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:57.316 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.316 "name": "Existed_Raid", 00:21:57.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:57.316 "strip_size_kb": 0, 00:21:57.316 "state": "configuring", 00:21:57.316 "raid_level": "raid1", 00:21:57.316 "superblock": false, 00:21:57.316 "num_base_bdevs": 4, 00:21:57.316 "num_base_bdevs_discovered": 2, 00:21:57.316 "num_base_bdevs_operational": 4, 00:21:57.316 "base_bdevs_list": [ 00:21:57.316 { 00:21:57.316 "name": "BaseBdev1", 00:21:57.316 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:21:57.316 "is_configured": true, 00:21:57.316 "data_offset": 0, 00:21:57.316 "data_size": 65536 00:21:57.316 }, 00:21:57.316 { 00:21:57.316 "name": null, 00:21:57.316 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:21:57.316 "is_configured": false, 00:21:57.316 "data_offset": 0, 00:21:57.316 "data_size": 65536 00:21:57.316 }, 00:21:57.316 { 00:21:57.316 "name": null, 00:21:57.316 
"uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:21:57.316 "is_configured": false, 00:21:57.316 "data_offset": 0, 00:21:57.316 "data_size": 65536 00:21:57.316 }, 00:21:57.316 { 00:21:57.316 "name": "BaseBdev4", 00:21:57.316 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:21:57.316 "is_configured": true, 00:21:57.316 "data_offset": 0, 00:21:57.316 "data_size": 65536 00:21:57.316 } 00:21:57.316 ] 00:21:57.316 }' 00:21:57.316 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.316 10:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:57.882 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.882 10:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:58.143 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:58.143 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:58.400 [2024-07-15 10:29:35.357845] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.400 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:58.658 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.658 "name": "Existed_Raid", 00:21:58.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.658 "strip_size_kb": 0, 00:21:58.658 "state": "configuring", 00:21:58.658 "raid_level": "raid1", 00:21:58.658 "superblock": false, 00:21:58.658 "num_base_bdevs": 4, 00:21:58.658 "num_base_bdevs_discovered": 3, 00:21:58.658 "num_base_bdevs_operational": 4, 00:21:58.658 "base_bdevs_list": [ 00:21:58.658 { 00:21:58.658 "name": "BaseBdev1", 00:21:58.658 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:21:58.658 "is_configured": true, 00:21:58.658 "data_offset": 0, 00:21:58.658 "data_size": 65536 00:21:58.658 }, 00:21:58.658 { 00:21:58.658 "name": null, 00:21:58.658 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:21:58.658 "is_configured": false, 00:21:58.658 "data_offset": 0, 00:21:58.658 "data_size": 65536 00:21:58.658 }, 00:21:58.658 { 00:21:58.658 "name": "BaseBdev3", 00:21:58.658 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:21:58.658 "is_configured": true, 
00:21:58.658 "data_offset": 0, 00:21:58.658 "data_size": 65536 00:21:58.658 }, 00:21:58.658 { 00:21:58.658 "name": "BaseBdev4", 00:21:58.658 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:21:58.658 "is_configured": true, 00:21:58.658 "data_offset": 0, 00:21:58.658 "data_size": 65536 00:21:58.658 } 00:21:58.658 ] 00:21:58.658 }' 00:21:58.658 10:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.658 10:29:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.223 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:59.223 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.481 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:59.481 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:59.739 [2024-07-15 10:29:36.701446] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.739 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:59.997 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.997 "name": "Existed_Raid", 00:21:59.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:59.997 "strip_size_kb": 0, 00:21:59.997 "state": "configuring", 00:21:59.997 "raid_level": "raid1", 00:21:59.997 "superblock": false, 00:21:59.997 "num_base_bdevs": 4, 00:21:59.997 "num_base_bdevs_discovered": 2, 00:21:59.997 "num_base_bdevs_operational": 4, 00:21:59.997 "base_bdevs_list": [ 00:21:59.997 { 00:21:59.997 "name": null, 00:21:59.997 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:21:59.997 "is_configured": false, 00:21:59.997 "data_offset": 0, 00:21:59.997 "data_size": 65536 00:21:59.997 }, 00:21:59.997 { 00:21:59.997 "name": null, 00:21:59.997 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:21:59.997 "is_configured": false, 00:21:59.997 "data_offset": 0, 00:21:59.997 "data_size": 65536 00:21:59.997 }, 00:21:59.997 { 00:21:59.997 "name": "BaseBdev3", 00:21:59.997 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:21:59.997 "is_configured": true, 00:21:59.997 "data_offset": 0, 00:21:59.997 "data_size": 65536 00:21:59.997 }, 00:21:59.997 { 00:21:59.997 "name": 
"BaseBdev4", 00:21:59.997 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:21:59.997 "is_configured": true, 00:21:59.997 "data_offset": 0, 00:21:59.997 "data_size": 65536 00:21:59.997 } 00:21:59.997 ] 00:21:59.997 }' 00:21:59.997 10:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.997 10:29:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:00.561 10:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.561 10:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:00.818 10:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:00.818 10:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:01.076 [2024-07-15 10:29:38.069735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.076 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:01.333 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:01.333 "name": "Existed_Raid", 00:22:01.333 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.333 "strip_size_kb": 0, 00:22:01.333 "state": "configuring", 00:22:01.333 "raid_level": "raid1", 00:22:01.333 "superblock": false, 00:22:01.333 "num_base_bdevs": 4, 00:22:01.333 "num_base_bdevs_discovered": 3, 00:22:01.333 "num_base_bdevs_operational": 4, 00:22:01.333 "base_bdevs_list": [ 00:22:01.333 { 00:22:01.333 "name": null, 00:22:01.333 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:22:01.333 "is_configured": false, 00:22:01.333 "data_offset": 0, 00:22:01.333 "data_size": 65536 00:22:01.333 }, 00:22:01.333 { 00:22:01.333 "name": "BaseBdev2", 00:22:01.333 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:22:01.333 "is_configured": true, 00:22:01.333 "data_offset": 0, 00:22:01.333 "data_size": 65536 00:22:01.333 }, 00:22:01.333 { 00:22:01.333 "name": "BaseBdev3", 00:22:01.333 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:22:01.333 "is_configured": true, 00:22:01.333 "data_offset": 0, 00:22:01.333 "data_size": 65536 00:22:01.333 }, 00:22:01.333 { 00:22:01.333 "name": "BaseBdev4", 00:22:01.333 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:22:01.333 
"is_configured": true, 00:22:01.333 "data_offset": 0, 00:22:01.333 "data_size": 65536 00:22:01.333 } 00:22:01.333 ] 00:22:01.333 }' 00:22:01.333 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:01.333 10:29:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.898 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.898 10:29:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:01.898 10:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:01.898 10:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.898 10:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:02.156 10:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 70af587f-0f99-41f5-8d47-52d08d1f270c 00:22:02.431 [2024-07-15 10:29:39.537638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:02.431 [2024-07-15 10:29:39.537682] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16e8610 00:22:02.431 [2024-07-15 10:29:39.537691] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:02.431 [2024-07-15 10:29:39.537886] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16e9a70 00:22:02.431 [2024-07-15 10:29:39.538023] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16e8610 00:22:02.431 [2024-07-15 
10:29:39.538040] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x16e8610 00:22:02.431 [2024-07-15 10:29:39.538203] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:02.431 NewBaseBdev 00:22:02.431 10:29:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:02.431 10:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:02.431 10:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:02.431 10:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:02.431 10:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:02.431 10:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:02.431 10:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:02.689 10:29:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:02.946 [ 00:22:02.946 { 00:22:02.946 "name": "NewBaseBdev", 00:22:02.946 "aliases": [ 00:22:02.946 "70af587f-0f99-41f5-8d47-52d08d1f270c" 00:22:02.946 ], 00:22:02.946 "product_name": "Malloc disk", 00:22:02.946 "block_size": 512, 00:22:02.946 "num_blocks": 65536, 00:22:02.946 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:22:02.946 "assigned_rate_limits": { 00:22:02.946 "rw_ios_per_sec": 0, 00:22:02.946 "rw_mbytes_per_sec": 0, 00:22:02.946 "r_mbytes_per_sec": 0, 00:22:02.946 "w_mbytes_per_sec": 0 00:22:02.946 }, 00:22:02.946 "claimed": true, 00:22:02.946 "claim_type": "exclusive_write", 00:22:02.946 "zoned": 
false, 00:22:02.946 "supported_io_types": { 00:22:02.946 "read": true, 00:22:02.946 "write": true, 00:22:02.946 "unmap": true, 00:22:02.946 "flush": true, 00:22:02.946 "reset": true, 00:22:02.946 "nvme_admin": false, 00:22:02.946 "nvme_io": false, 00:22:02.946 "nvme_io_md": false, 00:22:02.946 "write_zeroes": true, 00:22:02.946 "zcopy": true, 00:22:02.946 "get_zone_info": false, 00:22:02.946 "zone_management": false, 00:22:02.946 "zone_append": false, 00:22:02.946 "compare": false, 00:22:02.946 "compare_and_write": false, 00:22:02.946 "abort": true, 00:22:02.946 "seek_hole": false, 00:22:02.946 "seek_data": false, 00:22:02.946 "copy": true, 00:22:02.946 "nvme_iov_md": false 00:22:02.946 }, 00:22:02.946 "memory_domains": [ 00:22:02.946 { 00:22:02.946 "dma_device_id": "system", 00:22:02.946 "dma_device_type": 1 00:22:02.946 }, 00:22:02.946 { 00:22:02.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.946 "dma_device_type": 2 00:22:02.946 } 00:22:02.946 ], 00:22:02.946 "driver_specific": {} 00:22:02.946 } 00:22:02.946 ] 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:02.946 10:29:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.946 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:03.204 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:03.204 "name": "Existed_Raid", 00:22:03.204 "uuid": "7013ddfe-1e50-4fc9-ba41-a2a8e579c272", 00:22:03.204 "strip_size_kb": 0, 00:22:03.204 "state": "online", 00:22:03.204 "raid_level": "raid1", 00:22:03.204 "superblock": false, 00:22:03.204 "num_base_bdevs": 4, 00:22:03.204 "num_base_bdevs_discovered": 4, 00:22:03.204 "num_base_bdevs_operational": 4, 00:22:03.204 "base_bdevs_list": [ 00:22:03.204 { 00:22:03.204 "name": "NewBaseBdev", 00:22:03.204 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:22:03.204 "is_configured": true, 00:22:03.204 "data_offset": 0, 00:22:03.204 "data_size": 65536 00:22:03.204 }, 00:22:03.204 { 00:22:03.204 "name": "BaseBdev2", 00:22:03.204 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:22:03.204 "is_configured": true, 00:22:03.204 "data_offset": 0, 00:22:03.204 "data_size": 65536 00:22:03.204 }, 00:22:03.204 { 00:22:03.204 "name": "BaseBdev3", 00:22:03.204 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:22:03.204 "is_configured": true, 00:22:03.204 "data_offset": 0, 00:22:03.204 "data_size": 65536 00:22:03.204 }, 00:22:03.204 { 00:22:03.204 "name": "BaseBdev4", 00:22:03.204 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:22:03.204 "is_configured": true, 00:22:03.204 "data_offset": 0, 00:22:03.204 
"data_size": 65536 00:22:03.204 } 00:22:03.204 ] 00:22:03.204 }' 00:22:03.204 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:03.204 10:29:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:03.769 10:29:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:04.028 [2024-07-15 10:29:41.106203] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:04.028 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:04.028 "name": "Existed_Raid", 00:22:04.028 "aliases": [ 00:22:04.028 "7013ddfe-1e50-4fc9-ba41-a2a8e579c272" 00:22:04.028 ], 00:22:04.028 "product_name": "Raid Volume", 00:22:04.028 "block_size": 512, 00:22:04.028 "num_blocks": 65536, 00:22:04.028 "uuid": "7013ddfe-1e50-4fc9-ba41-a2a8e579c272", 00:22:04.028 "assigned_rate_limits": { 00:22:04.028 "rw_ios_per_sec": 0, 00:22:04.028 "rw_mbytes_per_sec": 0, 00:22:04.028 "r_mbytes_per_sec": 0, 00:22:04.028 "w_mbytes_per_sec": 0 00:22:04.028 }, 00:22:04.028 "claimed": false, 
00:22:04.028 "zoned": false, 00:22:04.028 "supported_io_types": { 00:22:04.028 "read": true, 00:22:04.028 "write": true, 00:22:04.028 "unmap": false, 00:22:04.028 "flush": false, 00:22:04.028 "reset": true, 00:22:04.028 "nvme_admin": false, 00:22:04.028 "nvme_io": false, 00:22:04.028 "nvme_io_md": false, 00:22:04.028 "write_zeroes": true, 00:22:04.028 "zcopy": false, 00:22:04.028 "get_zone_info": false, 00:22:04.028 "zone_management": false, 00:22:04.028 "zone_append": false, 00:22:04.028 "compare": false, 00:22:04.028 "compare_and_write": false, 00:22:04.028 "abort": false, 00:22:04.028 "seek_hole": false, 00:22:04.028 "seek_data": false, 00:22:04.028 "copy": false, 00:22:04.028 "nvme_iov_md": false 00:22:04.028 }, 00:22:04.028 "memory_domains": [ 00:22:04.028 { 00:22:04.028 "dma_device_id": "system", 00:22:04.028 "dma_device_type": 1 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.028 "dma_device_type": 2 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "dma_device_id": "system", 00:22:04.028 "dma_device_type": 1 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.028 "dma_device_type": 2 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "dma_device_id": "system", 00:22:04.028 "dma_device_type": 1 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.028 "dma_device_type": 2 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "dma_device_id": "system", 00:22:04.028 "dma_device_type": 1 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.028 "dma_device_type": 2 00:22:04.028 } 00:22:04.028 ], 00:22:04.028 "driver_specific": { 00:22:04.028 "raid": { 00:22:04.028 "uuid": "7013ddfe-1e50-4fc9-ba41-a2a8e579c272", 00:22:04.028 "strip_size_kb": 0, 00:22:04.028 "state": "online", 00:22:04.028 "raid_level": "raid1", 00:22:04.028 "superblock": false, 00:22:04.028 "num_base_bdevs": 4, 00:22:04.028 
"num_base_bdevs_discovered": 4, 00:22:04.028 "num_base_bdevs_operational": 4, 00:22:04.028 "base_bdevs_list": [ 00:22:04.028 { 00:22:04.028 "name": "NewBaseBdev", 00:22:04.028 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:22:04.028 "is_configured": true, 00:22:04.028 "data_offset": 0, 00:22:04.028 "data_size": 65536 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "name": "BaseBdev2", 00:22:04.028 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:22:04.028 "is_configured": true, 00:22:04.028 "data_offset": 0, 00:22:04.028 "data_size": 65536 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "name": "BaseBdev3", 00:22:04.028 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:22:04.028 "is_configured": true, 00:22:04.028 "data_offset": 0, 00:22:04.028 "data_size": 65536 00:22:04.028 }, 00:22:04.028 { 00:22:04.028 "name": "BaseBdev4", 00:22:04.029 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:22:04.029 "is_configured": true, 00:22:04.029 "data_offset": 0, 00:22:04.029 "data_size": 65536 00:22:04.029 } 00:22:04.029 ] 00:22:04.029 } 00:22:04.029 } 00:22:04.029 }' 00:22:04.029 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:04.029 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:04.029 BaseBdev2 00:22:04.029 BaseBdev3 00:22:04.029 BaseBdev4' 00:22:04.029 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:04.029 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:04.029 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:04.286 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:04.286 "name": "NewBaseBdev", 
00:22:04.286 "aliases": [ 00:22:04.286 "70af587f-0f99-41f5-8d47-52d08d1f270c" 00:22:04.286 ], 00:22:04.286 "product_name": "Malloc disk", 00:22:04.286 "block_size": 512, 00:22:04.286 "num_blocks": 65536, 00:22:04.287 "uuid": "70af587f-0f99-41f5-8d47-52d08d1f270c", 00:22:04.287 "assigned_rate_limits": { 00:22:04.287 "rw_ios_per_sec": 0, 00:22:04.287 "rw_mbytes_per_sec": 0, 00:22:04.287 "r_mbytes_per_sec": 0, 00:22:04.287 "w_mbytes_per_sec": 0 00:22:04.287 }, 00:22:04.287 "claimed": true, 00:22:04.287 "claim_type": "exclusive_write", 00:22:04.287 "zoned": false, 00:22:04.287 "supported_io_types": { 00:22:04.287 "read": true, 00:22:04.287 "write": true, 00:22:04.287 "unmap": true, 00:22:04.287 "flush": true, 00:22:04.287 "reset": true, 00:22:04.287 "nvme_admin": false, 00:22:04.287 "nvme_io": false, 00:22:04.287 "nvme_io_md": false, 00:22:04.287 "write_zeroes": true, 00:22:04.287 "zcopy": true, 00:22:04.287 "get_zone_info": false, 00:22:04.287 "zone_management": false, 00:22:04.287 "zone_append": false, 00:22:04.287 "compare": false, 00:22:04.287 "compare_and_write": false, 00:22:04.287 "abort": true, 00:22:04.287 "seek_hole": false, 00:22:04.287 "seek_data": false, 00:22:04.287 "copy": true, 00:22:04.287 "nvme_iov_md": false 00:22:04.287 }, 00:22:04.287 "memory_domains": [ 00:22:04.287 { 00:22:04.287 "dma_device_id": "system", 00:22:04.287 "dma_device_type": 1 00:22:04.287 }, 00:22:04.287 { 00:22:04.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.287 "dma_device_type": 2 00:22:04.287 } 00:22:04.287 ], 00:22:04.287 "driver_specific": {} 00:22:04.287 }' 00:22:04.287 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.287 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:04.545 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:04.804 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:04.804 "name": "BaseBdev2", 00:22:04.804 "aliases": [ 00:22:04.804 "ff2e39c4-ad9c-458e-b049-0311d85b911c" 00:22:04.804 ], 00:22:04.804 "product_name": "Malloc disk", 00:22:04.804 "block_size": 512, 00:22:04.804 "num_blocks": 65536, 00:22:04.804 "uuid": "ff2e39c4-ad9c-458e-b049-0311d85b911c", 00:22:04.804 "assigned_rate_limits": { 00:22:04.804 "rw_ios_per_sec": 0, 00:22:04.804 "rw_mbytes_per_sec": 0, 00:22:04.804 "r_mbytes_per_sec": 0, 00:22:04.804 "w_mbytes_per_sec": 0 00:22:04.804 }, 00:22:04.804 "claimed": true, 00:22:04.804 "claim_type": "exclusive_write", 00:22:04.804 "zoned": false, 00:22:04.804 "supported_io_types": { 00:22:04.804 
"read": true, 00:22:04.804 "write": true, 00:22:04.804 "unmap": true, 00:22:04.804 "flush": true, 00:22:04.804 "reset": true, 00:22:04.804 "nvme_admin": false, 00:22:04.804 "nvme_io": false, 00:22:04.804 "nvme_io_md": false, 00:22:04.804 "write_zeroes": true, 00:22:04.804 "zcopy": true, 00:22:04.804 "get_zone_info": false, 00:22:04.804 "zone_management": false, 00:22:04.804 "zone_append": false, 00:22:04.804 "compare": false, 00:22:04.804 "compare_and_write": false, 00:22:04.804 "abort": true, 00:22:04.804 "seek_hole": false, 00:22:04.804 "seek_data": false, 00:22:04.804 "copy": true, 00:22:04.804 "nvme_iov_md": false 00:22:04.804 }, 00:22:04.804 "memory_domains": [ 00:22:04.804 { 00:22:04.804 "dma_device_id": "system", 00:22:04.804 "dma_device_type": 1 00:22:04.804 }, 00:22:04.804 { 00:22:04.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:04.804 "dma_device_type": 2 00:22:04.804 } 00:22:04.804 ], 00:22:04.804 "driver_specific": {} 00:22:04.804 }' 00:22:04.804 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:04.804 10:29:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:22:05.063 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.322 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:05.322 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:05.322 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:05.322 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:05.587 "name": "BaseBdev3", 00:22:05.587 "aliases": [ 00:22:05.587 "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9" 00:22:05.587 ], 00:22:05.587 "product_name": "Malloc disk", 00:22:05.587 "block_size": 512, 00:22:05.587 "num_blocks": 65536, 00:22:05.587 "uuid": "3d59f9b1-a3ea-46ab-996f-ac0bae53f1d9", 00:22:05.587 "assigned_rate_limits": { 00:22:05.587 "rw_ios_per_sec": 0, 00:22:05.587 "rw_mbytes_per_sec": 0, 00:22:05.587 "r_mbytes_per_sec": 0, 00:22:05.587 "w_mbytes_per_sec": 0 00:22:05.587 }, 00:22:05.587 "claimed": true, 00:22:05.587 "claim_type": "exclusive_write", 00:22:05.587 "zoned": false, 00:22:05.587 "supported_io_types": { 00:22:05.587 "read": true, 00:22:05.587 "write": true, 00:22:05.587 "unmap": true, 00:22:05.587 "flush": true, 00:22:05.587 "reset": true, 00:22:05.587 "nvme_admin": false, 00:22:05.587 "nvme_io": false, 00:22:05.587 "nvme_io_md": false, 00:22:05.587 "write_zeroes": true, 00:22:05.587 "zcopy": true, 00:22:05.587 "get_zone_info": false, 00:22:05.587 "zone_management": false, 00:22:05.587 "zone_append": false, 00:22:05.587 "compare": false, 00:22:05.587 "compare_and_write": false, 00:22:05.587 "abort": true, 00:22:05.587 "seek_hole": false, 00:22:05.587 "seek_data": false, 00:22:05.587 "copy": true, 00:22:05.587 "nvme_iov_md": 
false 00:22:05.587 }, 00:22:05.587 "memory_domains": [ 00:22:05.587 { 00:22:05.587 "dma_device_id": "system", 00:22:05.587 "dma_device_type": 1 00:22:05.587 }, 00:22:05.587 { 00:22:05.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.587 "dma_device_type": 2 00:22:05.587 } 00:22:05.587 ], 00:22:05.587 "driver_specific": {} 00:22:05.587 }' 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.587 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.848 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:05.848 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.848 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:05.848 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:05.848 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:05.848 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:05.848 10:29:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:22:06.106 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:06.106 "name": "BaseBdev4", 00:22:06.106 "aliases": [ 00:22:06.106 "8e356111-bba5-40d1-b8ed-86171b899474" 00:22:06.106 ], 00:22:06.106 "product_name": "Malloc disk", 00:22:06.106 "block_size": 512, 00:22:06.106 "num_blocks": 65536, 00:22:06.106 "uuid": "8e356111-bba5-40d1-b8ed-86171b899474", 00:22:06.106 "assigned_rate_limits": { 00:22:06.106 "rw_ios_per_sec": 0, 00:22:06.106 "rw_mbytes_per_sec": 0, 00:22:06.106 "r_mbytes_per_sec": 0, 00:22:06.106 "w_mbytes_per_sec": 0 00:22:06.106 }, 00:22:06.106 "claimed": true, 00:22:06.106 "claim_type": "exclusive_write", 00:22:06.106 "zoned": false, 00:22:06.106 "supported_io_types": { 00:22:06.106 "read": true, 00:22:06.106 "write": true, 00:22:06.106 "unmap": true, 00:22:06.106 "flush": true, 00:22:06.106 "reset": true, 00:22:06.106 "nvme_admin": false, 00:22:06.106 "nvme_io": false, 00:22:06.106 "nvme_io_md": false, 00:22:06.106 "write_zeroes": true, 00:22:06.106 "zcopy": true, 00:22:06.106 "get_zone_info": false, 00:22:06.106 "zone_management": false, 00:22:06.106 "zone_append": false, 00:22:06.106 "compare": false, 00:22:06.106 "compare_and_write": false, 00:22:06.106 "abort": true, 00:22:06.106 "seek_hole": false, 00:22:06.106 "seek_data": false, 00:22:06.106 "copy": true, 00:22:06.106 "nvme_iov_md": false 00:22:06.106 }, 00:22:06.106 "memory_domains": [ 00:22:06.106 { 00:22:06.106 "dma_device_id": "system", 00:22:06.106 "dma_device_type": 1 00:22:06.106 }, 00:22:06.106 { 00:22:06.106 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.106 "dma_device_type": 2 00:22:06.106 } 00:22:06.106 ], 00:22:06.106 "driver_specific": {} 00:22:06.106 }' 00:22:06.106 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.106 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.106 10:29:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:06.106 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.106 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.106 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:06.364 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.364 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.364 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:06.364 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.364 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.364 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:06.364 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:06.621 [2024-07-15 10:29:43.704772] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:06.621 [2024-07-15 10:29:43.704796] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:06.621 [2024-07-15 10:29:43.704853] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:06.621 [2024-07-15 10:29:43.705131] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:06.621 [2024-07-15 10:29:43.705145] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16e8610 name Existed_Raid, state offline 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 564337 00:22:06.621 10:29:43 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 564337 ']' 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 564337 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 564337 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 564337' 00:22:06.621 killing process with pid 564337 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 564337 00:22:06.621 [2024-07-15 10:29:43.801163] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:06.621 10:29:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 564337 00:22:06.879 [2024-07-15 10:29:43.837768] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:06.879 10:29:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:22:06.879 00:22:06.879 real 0m31.462s 00:22:06.879 user 0m57.707s 00:22:06.879 sys 0m5.683s 00:22:06.879 10:29:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:06.879 10:29:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:06.879 ************************************ 00:22:06.879 END TEST raid_state_function_test 00:22:06.879 ************************************ 00:22:07.137 10:29:44 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:22:07.137 10:29:44 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:22:07.137 10:29:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:07.137 10:29:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:07.137 10:29:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:07.137 ************************************ 00:22:07.137 START TEST raid_state_function_test_sb 00:22:07.137 ************************************ 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=569049 00:22:07.137 10:29:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 569049' 00:22:07.137 Process raid pid: 569049 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 569049 /var/tmp/spdk-raid.sock 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 569049 ']' 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:07.137 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:07.137 10:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:07.137 [2024-07-15 10:29:44.189862] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:22:07.137 [2024-07-15 10:29:44.189934] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:07.137 [2024-07-15 10:29:44.318942] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:07.396 [2024-07-15 10:29:44.425305] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:07.396 [2024-07-15 10:29:44.493620] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:07.396 [2024-07-15 10:29:44.493654] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:07.962 10:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:07.962 10:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:07.962 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:08.220 [2024-07-15 10:29:45.328802] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:08.220 [2024-07-15 10:29:45.328848] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:08.220 [2024-07-15 10:29:45.328859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:08.220 [2024-07-15 10:29:45.328871] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:08.220 [2024-07-15 10:29:45.328880] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:08.220 [2024-07-15 10:29:45.328891] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:22:08.220 [2024-07-15 10:29:45.328899] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:08.220 [2024-07-15 10:29:45.328910] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.220 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:08.478 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.478 "name": "Existed_Raid", 00:22:08.478 "uuid": "c02f08cd-5bf5-4c5f-828e-894a65885948", 
00:22:08.478 "strip_size_kb": 0, 00:22:08.478 "state": "configuring", 00:22:08.478 "raid_level": "raid1", 00:22:08.478 "superblock": true, 00:22:08.478 "num_base_bdevs": 4, 00:22:08.478 "num_base_bdevs_discovered": 0, 00:22:08.478 "num_base_bdevs_operational": 4, 00:22:08.478 "base_bdevs_list": [ 00:22:08.478 { 00:22:08.478 "name": "BaseBdev1", 00:22:08.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.478 "is_configured": false, 00:22:08.478 "data_offset": 0, 00:22:08.478 "data_size": 0 00:22:08.478 }, 00:22:08.478 { 00:22:08.478 "name": "BaseBdev2", 00:22:08.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.478 "is_configured": false, 00:22:08.478 "data_offset": 0, 00:22:08.478 "data_size": 0 00:22:08.478 }, 00:22:08.478 { 00:22:08.478 "name": "BaseBdev3", 00:22:08.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.478 "is_configured": false, 00:22:08.478 "data_offset": 0, 00:22:08.478 "data_size": 0 00:22:08.478 }, 00:22:08.478 { 00:22:08.478 "name": "BaseBdev4", 00:22:08.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.478 "is_configured": false, 00:22:08.478 "data_offset": 0, 00:22:08.478 "data_size": 0 00:22:08.478 } 00:22:08.478 ] 00:22:08.478 }' 00:22:08.478 10:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.478 10:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:09.044 10:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:09.302 [2024-07-15 10:29:46.327302] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:09.302 [2024-07-15 10:29:46.327336] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1593aa0 name Existed_Raid, state configuring 00:22:09.302 10:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:09.561 [2024-07-15 10:29:46.571979] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:09.561 [2024-07-15 10:29:46.572023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:09.561 [2024-07-15 10:29:46.572033] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:09.561 [2024-07-15 10:29:46.572045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:09.561 [2024-07-15 10:29:46.572054] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:09.561 [2024-07-15 10:29:46.572065] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:09.561 [2024-07-15 10:29:46.572074] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:09.561 [2024-07-15 10:29:46.572085] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:09.561 10:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:09.820 [2024-07-15 10:29:46.822391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:09.820 BaseBdev1 00:22:09.820 10:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:09.820 10:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:09.820 10:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:09.820 10:29:46 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:09.820 10:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:09.820 10:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:09.820 10:29:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:10.078 10:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:10.336 [ 00:22:10.336 { 00:22:10.336 "name": "BaseBdev1", 00:22:10.336 "aliases": [ 00:22:10.336 "87a12649-4079-4fd2-ba6e-b9372bc700b6" 00:22:10.336 ], 00:22:10.336 "product_name": "Malloc disk", 00:22:10.336 "block_size": 512, 00:22:10.336 "num_blocks": 65536, 00:22:10.336 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:10.336 "assigned_rate_limits": { 00:22:10.336 "rw_ios_per_sec": 0, 00:22:10.336 "rw_mbytes_per_sec": 0, 00:22:10.336 "r_mbytes_per_sec": 0, 00:22:10.336 "w_mbytes_per_sec": 0 00:22:10.336 }, 00:22:10.336 "claimed": true, 00:22:10.336 "claim_type": "exclusive_write", 00:22:10.336 "zoned": false, 00:22:10.336 "supported_io_types": { 00:22:10.336 "read": true, 00:22:10.336 "write": true, 00:22:10.336 "unmap": true, 00:22:10.336 "flush": true, 00:22:10.336 "reset": true, 00:22:10.336 "nvme_admin": false, 00:22:10.336 "nvme_io": false, 00:22:10.336 "nvme_io_md": false, 00:22:10.336 "write_zeroes": true, 00:22:10.336 "zcopy": true, 00:22:10.336 "get_zone_info": false, 00:22:10.336 "zone_management": false, 00:22:10.336 "zone_append": false, 00:22:10.336 "compare": false, 00:22:10.336 "compare_and_write": false, 00:22:10.336 "abort": true, 00:22:10.336 "seek_hole": false, 00:22:10.336 "seek_data": false, 
00:22:10.336 "copy": true, 00:22:10.336 "nvme_iov_md": false 00:22:10.336 }, 00:22:10.336 "memory_domains": [ 00:22:10.336 { 00:22:10.336 "dma_device_id": "system", 00:22:10.336 "dma_device_type": 1 00:22:10.336 }, 00:22:10.336 { 00:22:10.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:10.336 "dma_device_type": 2 00:22:10.336 } 00:22:10.336 ], 00:22:10.336 "driver_specific": {} 00:22:10.336 } 00:22:10.336 ] 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.336 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:22:10.594 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.594 "name": "Existed_Raid", 00:22:10.594 "uuid": "19773770-9fe5-4d28-a966-164b95e00d71", 00:22:10.594 "strip_size_kb": 0, 00:22:10.594 "state": "configuring", 00:22:10.594 "raid_level": "raid1", 00:22:10.594 "superblock": true, 00:22:10.594 "num_base_bdevs": 4, 00:22:10.594 "num_base_bdevs_discovered": 1, 00:22:10.594 "num_base_bdevs_operational": 4, 00:22:10.594 "base_bdevs_list": [ 00:22:10.594 { 00:22:10.594 "name": "BaseBdev1", 00:22:10.594 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:10.594 "is_configured": true, 00:22:10.594 "data_offset": 2048, 00:22:10.594 "data_size": 63488 00:22:10.594 }, 00:22:10.594 { 00:22:10.594 "name": "BaseBdev2", 00:22:10.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.594 "is_configured": false, 00:22:10.594 "data_offset": 0, 00:22:10.594 "data_size": 0 00:22:10.594 }, 00:22:10.594 { 00:22:10.594 "name": "BaseBdev3", 00:22:10.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.594 "is_configured": false, 00:22:10.594 "data_offset": 0, 00:22:10.594 "data_size": 0 00:22:10.594 }, 00:22:10.594 { 00:22:10.594 "name": "BaseBdev4", 00:22:10.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:10.594 "is_configured": false, 00:22:10.594 "data_offset": 0, 00:22:10.594 "data_size": 0 00:22:10.594 } 00:22:10.594 ] 00:22:10.594 }' 00:22:10.594 10:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.594 10:29:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:11.160 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:11.418 [2024-07-15 10:29:48.394557] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:22:11.418 [2024-07-15 10:29:48.394595] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1593310 name Existed_Raid, state configuring 00:22:11.418 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:11.680 [2024-07-15 10:29:48.643267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:11.680 [2024-07-15 10:29:48.644707] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:11.680 [2024-07-15 10:29:48.644739] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:11.680 [2024-07-15 10:29:48.644750] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:11.680 [2024-07-15 10:29:48.644761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:11.680 [2024-07-15 10:29:48.644770] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:11.680 [2024-07-15 10:29:48.644781] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:11.680 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:11.680 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:11.681 10:29:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.681 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:11.939 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:11.939 "name": "Existed_Raid", 00:22:11.939 "uuid": "5ba59be7-ca19-448f-9906-8101f6c2ea92", 00:22:11.939 "strip_size_kb": 0, 00:22:11.939 "state": "configuring", 00:22:11.939 "raid_level": "raid1", 00:22:11.939 "superblock": true, 00:22:11.939 "num_base_bdevs": 4, 00:22:11.939 "num_base_bdevs_discovered": 1, 00:22:11.939 "num_base_bdevs_operational": 4, 00:22:11.939 "base_bdevs_list": [ 00:22:11.939 { 00:22:11.939 "name": "BaseBdev1", 00:22:11.939 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:11.939 "is_configured": true, 00:22:11.939 "data_offset": 2048, 00:22:11.939 "data_size": 63488 00:22:11.939 }, 00:22:11.939 { 00:22:11.939 "name": "BaseBdev2", 00:22:11.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.939 "is_configured": false, 
00:22:11.939 "data_offset": 0, 00:22:11.939 "data_size": 0 00:22:11.939 }, 00:22:11.939 { 00:22:11.939 "name": "BaseBdev3", 00:22:11.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.939 "is_configured": false, 00:22:11.939 "data_offset": 0, 00:22:11.939 "data_size": 0 00:22:11.939 }, 00:22:11.939 { 00:22:11.939 "name": "BaseBdev4", 00:22:11.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.939 "is_configured": false, 00:22:11.939 "data_offset": 0, 00:22:11.939 "data_size": 0 00:22:11.939 } 00:22:11.939 ] 00:22:11.939 }' 00:22:11.939 10:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:11.939 10:29:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:12.503 10:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:12.761 [2024-07-15 10:29:49.749607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:12.761 BaseBdev2 00:22:12.761 10:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:12.761 10:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:12.761 10:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:12.761 10:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:12.761 10:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:12.761 10:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:12.761 10:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:22:13.020 10:29:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:13.278 [ 00:22:13.278 { 00:22:13.278 "name": "BaseBdev2", 00:22:13.278 "aliases": [ 00:22:13.278 "e25a7453-4b48-4250-b031-770dee7ab835" 00:22:13.278 ], 00:22:13.278 "product_name": "Malloc disk", 00:22:13.278 "block_size": 512, 00:22:13.278 "num_blocks": 65536, 00:22:13.278 "uuid": "e25a7453-4b48-4250-b031-770dee7ab835", 00:22:13.278 "assigned_rate_limits": { 00:22:13.278 "rw_ios_per_sec": 0, 00:22:13.278 "rw_mbytes_per_sec": 0, 00:22:13.278 "r_mbytes_per_sec": 0, 00:22:13.278 "w_mbytes_per_sec": 0 00:22:13.278 }, 00:22:13.278 "claimed": true, 00:22:13.278 "claim_type": "exclusive_write", 00:22:13.278 "zoned": false, 00:22:13.278 "supported_io_types": { 00:22:13.278 "read": true, 00:22:13.278 "write": true, 00:22:13.278 "unmap": true, 00:22:13.278 "flush": true, 00:22:13.278 "reset": true, 00:22:13.278 "nvme_admin": false, 00:22:13.278 "nvme_io": false, 00:22:13.278 "nvme_io_md": false, 00:22:13.278 "write_zeroes": true, 00:22:13.278 "zcopy": true, 00:22:13.278 "get_zone_info": false, 00:22:13.278 "zone_management": false, 00:22:13.278 "zone_append": false, 00:22:13.278 "compare": false, 00:22:13.278 "compare_and_write": false, 00:22:13.278 "abort": true, 00:22:13.278 "seek_hole": false, 00:22:13.278 "seek_data": false, 00:22:13.278 "copy": true, 00:22:13.278 "nvme_iov_md": false 00:22:13.278 }, 00:22:13.278 "memory_domains": [ 00:22:13.278 { 00:22:13.278 "dma_device_id": "system", 00:22:13.278 "dma_device_type": 1 00:22:13.278 }, 00:22:13.278 { 00:22:13.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:13.278 "dma_device_type": 2 00:22:13.278 } 00:22:13.278 ], 00:22:13.278 "driver_specific": {} 00:22:13.278 } 00:22:13.278 ] 00:22:13.278 10:29:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:22:13.278 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:13.278 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:13.278 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:13.278 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:13.278 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:13.278 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.279 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:13.537 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:13.537 "name": "Existed_Raid", 00:22:13.537 "uuid": "5ba59be7-ca19-448f-9906-8101f6c2ea92", 00:22:13.537 "strip_size_kb": 0, 
00:22:13.537 "state": "configuring", 00:22:13.537 "raid_level": "raid1", 00:22:13.537 "superblock": true, 00:22:13.537 "num_base_bdevs": 4, 00:22:13.537 "num_base_bdevs_discovered": 2, 00:22:13.537 "num_base_bdevs_operational": 4, 00:22:13.537 "base_bdevs_list": [ 00:22:13.537 { 00:22:13.537 "name": "BaseBdev1", 00:22:13.537 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:13.537 "is_configured": true, 00:22:13.537 "data_offset": 2048, 00:22:13.537 "data_size": 63488 00:22:13.537 }, 00:22:13.537 { 00:22:13.537 "name": "BaseBdev2", 00:22:13.537 "uuid": "e25a7453-4b48-4250-b031-770dee7ab835", 00:22:13.537 "is_configured": true, 00:22:13.537 "data_offset": 2048, 00:22:13.537 "data_size": 63488 00:22:13.537 }, 00:22:13.537 { 00:22:13.537 "name": "BaseBdev3", 00:22:13.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.537 "is_configured": false, 00:22:13.537 "data_offset": 0, 00:22:13.537 "data_size": 0 00:22:13.537 }, 00:22:13.537 { 00:22:13.537 "name": "BaseBdev4", 00:22:13.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:13.537 "is_configured": false, 00:22:13.537 "data_offset": 0, 00:22:13.537 "data_size": 0 00:22:13.537 } 00:22:13.537 ] 00:22:13.537 }' 00:22:13.537 10:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:13.537 10:29:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:14.104 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:14.362 [2024-07-15 10:29:51.309135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:14.362 BaseBdev3 00:22:14.362 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:14.362 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:22:14.362 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:14.362 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:14.362 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:14.362 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:14.363 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:14.621 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:14.621 [ 00:22:14.621 { 00:22:14.621 "name": "BaseBdev3", 00:22:14.621 "aliases": [ 00:22:14.621 "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f" 00:22:14.621 ], 00:22:14.621 "product_name": "Malloc disk", 00:22:14.621 "block_size": 512, 00:22:14.621 "num_blocks": 65536, 00:22:14.621 "uuid": "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f", 00:22:14.621 "assigned_rate_limits": { 00:22:14.621 "rw_ios_per_sec": 0, 00:22:14.621 "rw_mbytes_per_sec": 0, 00:22:14.621 "r_mbytes_per_sec": 0, 00:22:14.621 "w_mbytes_per_sec": 0 00:22:14.621 }, 00:22:14.621 "claimed": true, 00:22:14.621 "claim_type": "exclusive_write", 00:22:14.621 "zoned": false, 00:22:14.621 "supported_io_types": { 00:22:14.621 "read": true, 00:22:14.621 "write": true, 00:22:14.621 "unmap": true, 00:22:14.621 "flush": true, 00:22:14.621 "reset": true, 00:22:14.621 "nvme_admin": false, 00:22:14.621 "nvme_io": false, 00:22:14.621 "nvme_io_md": false, 00:22:14.621 "write_zeroes": true, 00:22:14.621 "zcopy": true, 00:22:14.621 "get_zone_info": false, 00:22:14.621 "zone_management": false, 00:22:14.621 "zone_append": false, 00:22:14.621 
"compare": false, 00:22:14.621 "compare_and_write": false, 00:22:14.621 "abort": true, 00:22:14.621 "seek_hole": false, 00:22:14.621 "seek_data": false, 00:22:14.621 "copy": true, 00:22:14.621 "nvme_iov_md": false 00:22:14.621 }, 00:22:14.621 "memory_domains": [ 00:22:14.621 { 00:22:14.621 "dma_device_id": "system", 00:22:14.621 "dma_device_type": 1 00:22:14.621 }, 00:22:14.621 { 00:22:14.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.621 "dma_device_type": 2 00:22:14.621 } 00:22:14.621 ], 00:22:14.621 "driver_specific": {} 00:22:14.621 } 00:22:14.621 ] 00:22:14.879 10:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:14.879 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:14.879 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:14.879 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:14.879 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:14.879 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.880 10:29:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.880 10:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:15.138 10:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.138 "name": "Existed_Raid", 00:22:15.138 "uuid": "5ba59be7-ca19-448f-9906-8101f6c2ea92", 00:22:15.138 "strip_size_kb": 0, 00:22:15.138 "state": "configuring", 00:22:15.138 "raid_level": "raid1", 00:22:15.138 "superblock": true, 00:22:15.138 "num_base_bdevs": 4, 00:22:15.138 "num_base_bdevs_discovered": 3, 00:22:15.138 "num_base_bdevs_operational": 4, 00:22:15.138 "base_bdevs_list": [ 00:22:15.138 { 00:22:15.138 "name": "BaseBdev1", 00:22:15.138 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:15.138 "is_configured": true, 00:22:15.138 "data_offset": 2048, 00:22:15.138 "data_size": 63488 00:22:15.138 }, 00:22:15.138 { 00:22:15.138 "name": "BaseBdev2", 00:22:15.138 "uuid": "e25a7453-4b48-4250-b031-770dee7ab835", 00:22:15.138 "is_configured": true, 00:22:15.138 "data_offset": 2048, 00:22:15.138 "data_size": 63488 00:22:15.138 }, 00:22:15.138 { 00:22:15.138 "name": "BaseBdev3", 00:22:15.138 "uuid": "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f", 00:22:15.138 "is_configured": true, 00:22:15.138 "data_offset": 2048, 00:22:15.138 "data_size": 63488 00:22:15.138 }, 00:22:15.138 { 00:22:15.138 "name": "BaseBdev4", 00:22:15.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.138 "is_configured": false, 00:22:15.138 "data_offset": 0, 00:22:15.138 "data_size": 0 00:22:15.138 } 00:22:15.138 ] 00:22:15.138 }' 00:22:15.138 10:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.138 10:29:52 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:15.746 10:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:15.746 [2024-07-15 10:29:52.892705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:15.746 [2024-07-15 10:29:52.892877] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1594350 00:22:15.746 [2024-07-15 10:29:52.892891] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:15.746 [2024-07-15 10:29:52.893074] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1594020 00:22:15.746 [2024-07-15 10:29:52.893196] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1594350 00:22:15.746 [2024-07-15 10:29:52.893207] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1594350 00:22:15.746 [2024-07-15 10:29:52.893299] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:15.746 BaseBdev4 00:22:16.011 10:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:16.011 10:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:16.011 10:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:16.011 10:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:16.011 10:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:16.011 10:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:16.011 10:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:16.011 10:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:16.268 [ 00:22:16.268 { 00:22:16.268 "name": "BaseBdev4", 00:22:16.268 "aliases": [ 00:22:16.268 "3b656391-09c5-4b2a-99b7-4ac8c30c6665" 00:22:16.268 ], 00:22:16.268 "product_name": "Malloc disk", 00:22:16.268 "block_size": 512, 00:22:16.268 "num_blocks": 65536, 00:22:16.268 "uuid": "3b656391-09c5-4b2a-99b7-4ac8c30c6665", 00:22:16.268 "assigned_rate_limits": { 00:22:16.268 "rw_ios_per_sec": 0, 00:22:16.268 "rw_mbytes_per_sec": 0, 00:22:16.268 "r_mbytes_per_sec": 0, 00:22:16.268 "w_mbytes_per_sec": 0 00:22:16.268 }, 00:22:16.268 "claimed": true, 00:22:16.268 "claim_type": "exclusive_write", 00:22:16.268 "zoned": false, 00:22:16.268 "supported_io_types": { 00:22:16.268 "read": true, 00:22:16.268 "write": true, 00:22:16.268 "unmap": true, 00:22:16.268 "flush": true, 00:22:16.268 "reset": true, 00:22:16.268 "nvme_admin": false, 00:22:16.268 "nvme_io": false, 00:22:16.268 "nvme_io_md": false, 00:22:16.268 "write_zeroes": true, 00:22:16.268 "zcopy": true, 00:22:16.268 "get_zone_info": false, 00:22:16.268 "zone_management": false, 00:22:16.268 "zone_append": false, 00:22:16.268 "compare": false, 00:22:16.268 "compare_and_write": false, 00:22:16.268 "abort": true, 00:22:16.268 "seek_hole": false, 00:22:16.268 "seek_data": false, 00:22:16.268 "copy": true, 00:22:16.268 "nvme_iov_md": false 00:22:16.268 }, 00:22:16.268 "memory_domains": [ 00:22:16.268 { 00:22:16.268 "dma_device_id": "system", 00:22:16.268 "dma_device_type": 1 00:22:16.268 }, 00:22:16.268 { 00:22:16.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.268 "dma_device_type": 2 00:22:16.268 } 00:22:16.268 ], 00:22:16.268 "driver_specific": {} 00:22:16.268 } 00:22:16.268 ] 
00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.268 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:16.526 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:16.526 "name": "Existed_Raid", 00:22:16.526 
"uuid": "5ba59be7-ca19-448f-9906-8101f6c2ea92", 00:22:16.526 "strip_size_kb": 0, 00:22:16.526 "state": "online", 00:22:16.526 "raid_level": "raid1", 00:22:16.526 "superblock": true, 00:22:16.526 "num_base_bdevs": 4, 00:22:16.526 "num_base_bdevs_discovered": 4, 00:22:16.526 "num_base_bdevs_operational": 4, 00:22:16.526 "base_bdevs_list": [ 00:22:16.526 { 00:22:16.526 "name": "BaseBdev1", 00:22:16.526 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:16.526 "is_configured": true, 00:22:16.526 "data_offset": 2048, 00:22:16.526 "data_size": 63488 00:22:16.526 }, 00:22:16.526 { 00:22:16.526 "name": "BaseBdev2", 00:22:16.526 "uuid": "e25a7453-4b48-4250-b031-770dee7ab835", 00:22:16.526 "is_configured": true, 00:22:16.526 "data_offset": 2048, 00:22:16.526 "data_size": 63488 00:22:16.526 }, 00:22:16.526 { 00:22:16.526 "name": "BaseBdev3", 00:22:16.526 "uuid": "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f", 00:22:16.526 "is_configured": true, 00:22:16.526 "data_offset": 2048, 00:22:16.526 "data_size": 63488 00:22:16.526 }, 00:22:16.526 { 00:22:16.526 "name": "BaseBdev4", 00:22:16.526 "uuid": "3b656391-09c5-4b2a-99b7-4ac8c30c6665", 00:22:16.526 "is_configured": true, 00:22:16.526 "data_offset": 2048, 00:22:16.526 "data_size": 63488 00:22:16.526 } 00:22:16.526 ] 00:22:16.526 }' 00:22:16.526 10:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:16.526 10:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:17.092 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:17.092 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:17.092 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:17.092 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:17.092 10:29:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:17.092 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:17.092 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:17.092 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:17.350 [2024-07-15 10:29:54.409054] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:17.350 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:17.350 "name": "Existed_Raid", 00:22:17.350 "aliases": [ 00:22:17.350 "5ba59be7-ca19-448f-9906-8101f6c2ea92" 00:22:17.350 ], 00:22:17.350 "product_name": "Raid Volume", 00:22:17.350 "block_size": 512, 00:22:17.350 "num_blocks": 63488, 00:22:17.350 "uuid": "5ba59be7-ca19-448f-9906-8101f6c2ea92", 00:22:17.350 "assigned_rate_limits": { 00:22:17.351 "rw_ios_per_sec": 0, 00:22:17.351 "rw_mbytes_per_sec": 0, 00:22:17.351 "r_mbytes_per_sec": 0, 00:22:17.351 "w_mbytes_per_sec": 0 00:22:17.351 }, 00:22:17.351 "claimed": false, 00:22:17.351 "zoned": false, 00:22:17.351 "supported_io_types": { 00:22:17.351 "read": true, 00:22:17.351 "write": true, 00:22:17.351 "unmap": false, 00:22:17.351 "flush": false, 00:22:17.351 "reset": true, 00:22:17.351 "nvme_admin": false, 00:22:17.351 "nvme_io": false, 00:22:17.351 "nvme_io_md": false, 00:22:17.351 "write_zeroes": true, 00:22:17.351 "zcopy": false, 00:22:17.351 "get_zone_info": false, 00:22:17.351 "zone_management": false, 00:22:17.351 "zone_append": false, 00:22:17.351 "compare": false, 00:22:17.351 "compare_and_write": false, 00:22:17.351 "abort": false, 00:22:17.351 "seek_hole": false, 00:22:17.351 "seek_data": false, 00:22:17.351 "copy": false, 00:22:17.351 "nvme_iov_md": false 00:22:17.351 }, 00:22:17.351 
"memory_domains": [ 00:22:17.351 { 00:22:17.351 "dma_device_id": "system", 00:22:17.351 "dma_device_type": 1 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.351 "dma_device_type": 2 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "dma_device_id": "system", 00:22:17.351 "dma_device_type": 1 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.351 "dma_device_type": 2 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "dma_device_id": "system", 00:22:17.351 "dma_device_type": 1 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.351 "dma_device_type": 2 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "dma_device_id": "system", 00:22:17.351 "dma_device_type": 1 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.351 "dma_device_type": 2 00:22:17.351 } 00:22:17.351 ], 00:22:17.351 "driver_specific": { 00:22:17.351 "raid": { 00:22:17.351 "uuid": "5ba59be7-ca19-448f-9906-8101f6c2ea92", 00:22:17.351 "strip_size_kb": 0, 00:22:17.351 "state": "online", 00:22:17.351 "raid_level": "raid1", 00:22:17.351 "superblock": true, 00:22:17.351 "num_base_bdevs": 4, 00:22:17.351 "num_base_bdevs_discovered": 4, 00:22:17.351 "num_base_bdevs_operational": 4, 00:22:17.351 "base_bdevs_list": [ 00:22:17.351 { 00:22:17.351 "name": "BaseBdev1", 00:22:17.351 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:17.351 "is_configured": true, 00:22:17.351 "data_offset": 2048, 00:22:17.351 "data_size": 63488 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "name": "BaseBdev2", 00:22:17.351 "uuid": "e25a7453-4b48-4250-b031-770dee7ab835", 00:22:17.351 "is_configured": true, 00:22:17.351 "data_offset": 2048, 00:22:17.351 "data_size": 63488 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "name": "BaseBdev3", 00:22:17.351 "uuid": "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f", 00:22:17.351 "is_configured": true, 00:22:17.351 "data_offset": 2048, 00:22:17.351 
"data_size": 63488 00:22:17.351 }, 00:22:17.351 { 00:22:17.351 "name": "BaseBdev4", 00:22:17.351 "uuid": "3b656391-09c5-4b2a-99b7-4ac8c30c6665", 00:22:17.351 "is_configured": true, 00:22:17.351 "data_offset": 2048, 00:22:17.351 "data_size": 63488 00:22:17.351 } 00:22:17.351 ] 00:22:17.351 } 00:22:17.351 } 00:22:17.351 }' 00:22:17.351 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:17.351 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:17.351 BaseBdev2 00:22:17.351 BaseBdev3 00:22:17.351 BaseBdev4' 00:22:17.351 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:17.351 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:17.351 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:17.611 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:17.611 "name": "BaseBdev1", 00:22:17.611 "aliases": [ 00:22:17.611 "87a12649-4079-4fd2-ba6e-b9372bc700b6" 00:22:17.611 ], 00:22:17.611 "product_name": "Malloc disk", 00:22:17.611 "block_size": 512, 00:22:17.611 "num_blocks": 65536, 00:22:17.611 "uuid": "87a12649-4079-4fd2-ba6e-b9372bc700b6", 00:22:17.611 "assigned_rate_limits": { 00:22:17.611 "rw_ios_per_sec": 0, 00:22:17.611 "rw_mbytes_per_sec": 0, 00:22:17.611 "r_mbytes_per_sec": 0, 00:22:17.611 "w_mbytes_per_sec": 0 00:22:17.611 }, 00:22:17.611 "claimed": true, 00:22:17.611 "claim_type": "exclusive_write", 00:22:17.611 "zoned": false, 00:22:17.611 "supported_io_types": { 00:22:17.611 "read": true, 00:22:17.611 "write": true, 00:22:17.611 "unmap": true, 00:22:17.611 "flush": true, 00:22:17.611 "reset": true, 
00:22:17.611 "nvme_admin": false, 00:22:17.611 "nvme_io": false, 00:22:17.611 "nvme_io_md": false, 00:22:17.611 "write_zeroes": true, 00:22:17.611 "zcopy": true, 00:22:17.611 "get_zone_info": false, 00:22:17.611 "zone_management": false, 00:22:17.611 "zone_append": false, 00:22:17.611 "compare": false, 00:22:17.611 "compare_and_write": false, 00:22:17.611 "abort": true, 00:22:17.611 "seek_hole": false, 00:22:17.611 "seek_data": false, 00:22:17.611 "copy": true, 00:22:17.611 "nvme_iov_md": false 00:22:17.611 }, 00:22:17.611 "memory_domains": [ 00:22:17.611 { 00:22:17.611 "dma_device_id": "system", 00:22:17.611 "dma_device_type": 1 00:22:17.611 }, 00:22:17.611 { 00:22:17.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.611 "dma_device_type": 2 00:22:17.611 } 00:22:17.611 ], 00:22:17.611 "driver_specific": {} 00:22:17.611 }' 00:22:17.611 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.611 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:17.870 10:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.870 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:18.129 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:18.129 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:18.129 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:18.129 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:18.129 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:18.129 "name": "BaseBdev2", 00:22:18.129 "aliases": [ 00:22:18.129 "e25a7453-4b48-4250-b031-770dee7ab835" 00:22:18.129 ], 00:22:18.129 "product_name": "Malloc disk", 00:22:18.129 "block_size": 512, 00:22:18.129 "num_blocks": 65536, 00:22:18.129 "uuid": "e25a7453-4b48-4250-b031-770dee7ab835", 00:22:18.129 "assigned_rate_limits": { 00:22:18.129 "rw_ios_per_sec": 0, 00:22:18.129 "rw_mbytes_per_sec": 0, 00:22:18.129 "r_mbytes_per_sec": 0, 00:22:18.129 "w_mbytes_per_sec": 0 00:22:18.129 }, 00:22:18.129 "claimed": true, 00:22:18.129 "claim_type": "exclusive_write", 00:22:18.129 "zoned": false, 00:22:18.129 "supported_io_types": { 00:22:18.129 "read": true, 00:22:18.129 "write": true, 00:22:18.129 "unmap": true, 00:22:18.129 "flush": true, 00:22:18.129 "reset": true, 00:22:18.129 "nvme_admin": false, 00:22:18.129 "nvme_io": false, 00:22:18.129 "nvme_io_md": false, 00:22:18.129 "write_zeroes": true, 00:22:18.129 "zcopy": true, 00:22:18.129 "get_zone_info": false, 00:22:18.129 "zone_management": false, 00:22:18.129 "zone_append": false, 00:22:18.129 "compare": false, 00:22:18.129 "compare_and_write": false, 00:22:18.129 "abort": true, 00:22:18.129 "seek_hole": false, 00:22:18.129 "seek_data": false, 00:22:18.129 "copy": true, 00:22:18.129 "nvme_iov_md": false 00:22:18.129 }, 00:22:18.129 "memory_domains": [ 00:22:18.129 { 
00:22:18.129 "dma_device_id": "system", 00:22:18.129 "dma_device_type": 1 00:22:18.129 }, 00:22:18.129 { 00:22:18.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:18.129 "dma_device_type": 2 00:22:18.129 } 00:22:18.129 ], 00:22:18.129 "driver_specific": {} 00:22:18.129 }' 00:22:18.129 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.387 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.387 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:18.387 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:18.387 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:18.387 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:18.387 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:18.387 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:18.645 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:18.645 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:18.645 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:18.645 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:18.646 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:18.646 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:18.646 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:18.904 10:29:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:18.904 "name": "BaseBdev3", 00:22:18.904 "aliases": [ 00:22:18.904 "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f" 00:22:18.904 ], 00:22:18.904 "product_name": "Malloc disk", 00:22:18.904 "block_size": 512, 00:22:18.904 "num_blocks": 65536, 00:22:18.904 "uuid": "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f", 00:22:18.904 "assigned_rate_limits": { 00:22:18.904 "rw_ios_per_sec": 0, 00:22:18.904 "rw_mbytes_per_sec": 0, 00:22:18.904 "r_mbytes_per_sec": 0, 00:22:18.904 "w_mbytes_per_sec": 0 00:22:18.904 }, 00:22:18.904 "claimed": true, 00:22:18.904 "claim_type": "exclusive_write", 00:22:18.904 "zoned": false, 00:22:18.904 "supported_io_types": { 00:22:18.904 "read": true, 00:22:18.904 "write": true, 00:22:18.904 "unmap": true, 00:22:18.904 "flush": true, 00:22:18.904 "reset": true, 00:22:18.904 "nvme_admin": false, 00:22:18.904 "nvme_io": false, 00:22:18.904 "nvme_io_md": false, 00:22:18.904 "write_zeroes": true, 00:22:18.904 "zcopy": true, 00:22:18.904 "get_zone_info": false, 00:22:18.904 "zone_management": false, 00:22:18.904 "zone_append": false, 00:22:18.904 "compare": false, 00:22:18.904 "compare_and_write": false, 00:22:18.904 "abort": true, 00:22:18.904 "seek_hole": false, 00:22:18.904 "seek_data": false, 00:22:18.904 "copy": true, 00:22:18.904 "nvme_iov_md": false 00:22:18.904 }, 00:22:18.904 "memory_domains": [ 00:22:18.904 { 00:22:18.904 "dma_device_id": "system", 00:22:18.904 "dma_device_type": 1 00:22:18.904 }, 00:22:18.904 { 00:22:18.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:18.904 "dma_device_type": 2 00:22:18.904 } 00:22:18.904 ], 00:22:18.904 "driver_specific": {} 00:22:18.904 }' 00:22:18.904 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.904 10:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:18.904 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:22:18.904 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:18.904 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:19.162 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:19.420 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:19.420 "name": "BaseBdev4", 00:22:19.420 "aliases": [ 00:22:19.420 "3b656391-09c5-4b2a-99b7-4ac8c30c6665" 00:22:19.420 ], 00:22:19.420 "product_name": "Malloc disk", 00:22:19.420 "block_size": 512, 00:22:19.420 "num_blocks": 65536, 00:22:19.420 "uuid": "3b656391-09c5-4b2a-99b7-4ac8c30c6665", 00:22:19.420 "assigned_rate_limits": { 00:22:19.420 "rw_ios_per_sec": 0, 00:22:19.420 "rw_mbytes_per_sec": 0, 00:22:19.420 "r_mbytes_per_sec": 0, 00:22:19.420 "w_mbytes_per_sec": 0 
00:22:19.420 }, 00:22:19.420 "claimed": true, 00:22:19.420 "claim_type": "exclusive_write", 00:22:19.420 "zoned": false, 00:22:19.420 "supported_io_types": { 00:22:19.420 "read": true, 00:22:19.420 "write": true, 00:22:19.420 "unmap": true, 00:22:19.420 "flush": true, 00:22:19.420 "reset": true, 00:22:19.420 "nvme_admin": false, 00:22:19.420 "nvme_io": false, 00:22:19.420 "nvme_io_md": false, 00:22:19.420 "write_zeroes": true, 00:22:19.420 "zcopy": true, 00:22:19.420 "get_zone_info": false, 00:22:19.420 "zone_management": false, 00:22:19.420 "zone_append": false, 00:22:19.420 "compare": false, 00:22:19.420 "compare_and_write": false, 00:22:19.420 "abort": true, 00:22:19.420 "seek_hole": false, 00:22:19.420 "seek_data": false, 00:22:19.420 "copy": true, 00:22:19.420 "nvme_iov_md": false 00:22:19.420 }, 00:22:19.420 "memory_domains": [ 00:22:19.420 { 00:22:19.420 "dma_device_id": "system", 00:22:19.420 "dma_device_type": 1 00:22:19.420 }, 00:22:19.420 { 00:22:19.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:19.420 "dma_device_type": 2 00:22:19.420 } 00:22:19.420 ], 00:22:19.420 "driver_specific": {} 00:22:19.420 }' 00:22:19.420 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:19.420 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:19.420 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:19.420 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:19.678 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:19.678 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:19.678 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:19.678 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:19.678 
10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:19.678 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:19.678 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:19.937 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:19.937 10:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:19.937 [2024-07-15 10:29:57.107957] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:19.937 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:19.938 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:22:19.938 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:19.938 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:19.938 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:19.938 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.196 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.196 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:20.196 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.196 "name": "Existed_Raid", 00:22:20.196 "uuid": "5ba59be7-ca19-448f-9906-8101f6c2ea92", 00:22:20.196 "strip_size_kb": 0, 00:22:20.196 "state": "online", 00:22:20.196 "raid_level": "raid1", 00:22:20.196 "superblock": true, 00:22:20.196 "num_base_bdevs": 4, 00:22:20.196 "num_base_bdevs_discovered": 3, 00:22:20.196 "num_base_bdevs_operational": 3, 00:22:20.196 "base_bdevs_list": [ 00:22:20.197 { 00:22:20.197 "name": null, 00:22:20.197 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.197 "is_configured": false, 00:22:20.197 "data_offset": 2048, 00:22:20.197 "data_size": 63488 00:22:20.197 }, 00:22:20.197 { 00:22:20.197 "name": "BaseBdev2", 00:22:20.197 "uuid": "e25a7453-4b48-4250-b031-770dee7ab835", 00:22:20.197 "is_configured": true, 00:22:20.197 "data_offset": 2048, 00:22:20.197 "data_size": 63488 00:22:20.197 }, 00:22:20.197 { 00:22:20.197 "name": "BaseBdev3", 00:22:20.197 "uuid": "d3d850e6-dff2-4c5f-888e-3b67ebeebb4f", 00:22:20.197 "is_configured": true, 00:22:20.197 "data_offset": 2048, 00:22:20.197 "data_size": 63488 00:22:20.197 }, 00:22:20.197 { 00:22:20.197 "name": 
"BaseBdev4", 00:22:20.197 "uuid": "3b656391-09c5-4b2a-99b7-4ac8c30c6665", 00:22:20.197 "is_configured": true, 00:22:20.197 "data_offset": 2048, 00:22:20.197 "data_size": 63488 00:22:20.197 } 00:22:20.197 ] 00:22:20.197 }' 00:22:20.197 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.197 10:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:21.133 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:21.133 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:21.133 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.133 10:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:21.133 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:21.133 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:21.133 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:21.392 [2024-07-15 10:29:58.452629] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:21.392 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:21.392 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:21.392 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.392 10:29:58 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:21.649 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:21.649 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:21.649 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:21.906 [2024-07-15 10:29:58.958532] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:21.906 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:21.906 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:21.906 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.906 10:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:22.164 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:22.164 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:22.164 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:22.423 [2024-07-15 10:29:59.456225] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:22.423 [2024-07-15 10:29:59.456310] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:22.423 [2024-07-15 10:29:59.467095] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:22.423 [2024-07-15 10:29:59.467130] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:22.423 [2024-07-15 10:29:59.467142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1594350 name Existed_Raid, state offline 00:22:22.423 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:22.423 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:22.423 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.423 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:22:22.681 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:22.681 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:22.681 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:22.681 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:22.681 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:22.681 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:22.939 BaseBdev2 00:22:22.939 10:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:22.939 10:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:22.939 10:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:22.939 10:29:59 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:22:22.939 10:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:22.939 10:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:22.939 10:29:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:23.198 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:23.456 [ 00:22:23.456 { 00:22:23.456 "name": "BaseBdev2", 00:22:23.456 "aliases": [ 00:22:23.456 "0209e759-be72-4f74-a472-267421e9bc9c" 00:22:23.456 ], 00:22:23.456 "product_name": "Malloc disk", 00:22:23.456 "block_size": 512, 00:22:23.456 "num_blocks": 65536, 00:22:23.456 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:23.456 "assigned_rate_limits": { 00:22:23.456 "rw_ios_per_sec": 0, 00:22:23.456 "rw_mbytes_per_sec": 0, 00:22:23.456 "r_mbytes_per_sec": 0, 00:22:23.456 "w_mbytes_per_sec": 0 00:22:23.456 }, 00:22:23.456 "claimed": false, 00:22:23.456 "zoned": false, 00:22:23.456 "supported_io_types": { 00:22:23.456 "read": true, 00:22:23.456 "write": true, 00:22:23.456 "unmap": true, 00:22:23.456 "flush": true, 00:22:23.456 "reset": true, 00:22:23.456 "nvme_admin": false, 00:22:23.456 "nvme_io": false, 00:22:23.456 "nvme_io_md": false, 00:22:23.456 "write_zeroes": true, 00:22:23.456 "zcopy": true, 00:22:23.456 "get_zone_info": false, 00:22:23.456 "zone_management": false, 00:22:23.456 "zone_append": false, 00:22:23.456 "compare": false, 00:22:23.456 "compare_and_write": false, 00:22:23.456 "abort": true, 00:22:23.456 "seek_hole": false, 00:22:23.456 "seek_data": false, 00:22:23.456 "copy": true, 00:22:23.456 "nvme_iov_md": false 00:22:23.456 }, 00:22:23.456 
"memory_domains": [ 00:22:23.456 { 00:22:23.456 "dma_device_id": "system", 00:22:23.456 "dma_device_type": 1 00:22:23.456 }, 00:22:23.456 { 00:22:23.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:23.456 "dma_device_type": 2 00:22:23.456 } 00:22:23.456 ], 00:22:23.456 "driver_specific": {} 00:22:23.456 } 00:22:23.456 ] 00:22:23.456 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:23.456 10:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:23.456 10:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:23.456 10:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:23.714 BaseBdev3 00:22:23.715 10:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:23.715 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:23.715 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:23.715 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:23.715 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:23.715 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:23.715 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:23.973 10:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:22:24.230 [ 00:22:24.230 { 00:22:24.230 "name": "BaseBdev3", 00:22:24.230 "aliases": [ 00:22:24.230 "c20cf15d-c0b6-4049-819b-34cc0269dd1a" 00:22:24.230 ], 00:22:24.230 "product_name": "Malloc disk", 00:22:24.230 "block_size": 512, 00:22:24.230 "num_blocks": 65536, 00:22:24.230 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:24.230 "assigned_rate_limits": { 00:22:24.230 "rw_ios_per_sec": 0, 00:22:24.230 "rw_mbytes_per_sec": 0, 00:22:24.230 "r_mbytes_per_sec": 0, 00:22:24.230 "w_mbytes_per_sec": 0 00:22:24.230 }, 00:22:24.230 "claimed": false, 00:22:24.230 "zoned": false, 00:22:24.230 "supported_io_types": { 00:22:24.230 "read": true, 00:22:24.230 "write": true, 00:22:24.230 "unmap": true, 00:22:24.230 "flush": true, 00:22:24.230 "reset": true, 00:22:24.230 "nvme_admin": false, 00:22:24.230 "nvme_io": false, 00:22:24.230 "nvme_io_md": false, 00:22:24.230 "write_zeroes": true, 00:22:24.230 "zcopy": true, 00:22:24.230 "get_zone_info": false, 00:22:24.230 "zone_management": false, 00:22:24.230 "zone_append": false, 00:22:24.230 "compare": false, 00:22:24.230 "compare_and_write": false, 00:22:24.230 "abort": true, 00:22:24.230 "seek_hole": false, 00:22:24.230 "seek_data": false, 00:22:24.230 "copy": true, 00:22:24.230 "nvme_iov_md": false 00:22:24.230 }, 00:22:24.230 "memory_domains": [ 00:22:24.230 { 00:22:24.230 "dma_device_id": "system", 00:22:24.230 "dma_device_type": 1 00:22:24.230 }, 00:22:24.230 { 00:22:24.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.230 "dma_device_type": 2 00:22:24.230 } 00:22:24.230 ], 00:22:24.230 "driver_specific": {} 00:22:24.230 } 00:22:24.230 ] 00:22:24.230 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:24.230 10:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:24.231 10:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:24.231 10:30:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:24.231 BaseBdev4 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:24.488 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:24.746 [ 00:22:24.746 { 00:22:24.746 "name": "BaseBdev4", 00:22:24.746 "aliases": [ 00:22:24.746 "4071f189-8e09-413a-a384-3ad3cd185d19" 00:22:24.746 ], 00:22:24.746 "product_name": "Malloc disk", 00:22:24.746 "block_size": 512, 00:22:24.746 "num_blocks": 65536, 00:22:24.746 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:24.746 "assigned_rate_limits": { 00:22:24.746 "rw_ios_per_sec": 0, 00:22:24.746 "rw_mbytes_per_sec": 0, 00:22:24.746 "r_mbytes_per_sec": 0, 00:22:24.746 "w_mbytes_per_sec": 0 00:22:24.746 }, 00:22:24.746 "claimed": false, 00:22:24.746 "zoned": false, 00:22:24.746 "supported_io_types": { 00:22:24.746 "read": true, 
00:22:24.746 "write": true, 00:22:24.746 "unmap": true, 00:22:24.746 "flush": true, 00:22:24.746 "reset": true, 00:22:24.746 "nvme_admin": false, 00:22:24.746 "nvme_io": false, 00:22:24.746 "nvme_io_md": false, 00:22:24.746 "write_zeroes": true, 00:22:24.746 "zcopy": true, 00:22:24.746 "get_zone_info": false, 00:22:24.746 "zone_management": false, 00:22:24.746 "zone_append": false, 00:22:24.746 "compare": false, 00:22:24.746 "compare_and_write": false, 00:22:24.746 "abort": true, 00:22:24.746 "seek_hole": false, 00:22:24.746 "seek_data": false, 00:22:24.746 "copy": true, 00:22:24.746 "nvme_iov_md": false 00:22:24.746 }, 00:22:24.746 "memory_domains": [ 00:22:24.746 { 00:22:24.746 "dma_device_id": "system", 00:22:24.746 "dma_device_type": 1 00:22:24.746 }, 00:22:24.746 { 00:22:24.746 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:24.746 "dma_device_type": 2 00:22:24.746 } 00:22:24.746 ], 00:22:24.746 "driver_specific": {} 00:22:24.746 } 00:22:24.746 ] 00:22:24.746 10:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:24.746 10:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:24.746 10:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:24.746 10:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:25.005 [2024-07-15 10:30:02.144310] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:25.005 [2024-07-15 10:30:02.144351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:25.005 [2024-07-15 10:30:02.144370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:25.005 [2024-07-15 10:30:02.145707] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:25.005 [2024-07-15 10:30:02.145748] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.005 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:25.264 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:25.264 "name": "Existed_Raid", 00:22:25.264 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:25.264 "strip_size_kb": 0, 00:22:25.264 "state": 
"configuring", 00:22:25.264 "raid_level": "raid1", 00:22:25.264 "superblock": true, 00:22:25.264 "num_base_bdevs": 4, 00:22:25.264 "num_base_bdevs_discovered": 3, 00:22:25.264 "num_base_bdevs_operational": 4, 00:22:25.264 "base_bdevs_list": [ 00:22:25.264 { 00:22:25.264 "name": "BaseBdev1", 00:22:25.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:25.264 "is_configured": false, 00:22:25.264 "data_offset": 0, 00:22:25.264 "data_size": 0 00:22:25.264 }, 00:22:25.264 { 00:22:25.264 "name": "BaseBdev2", 00:22:25.264 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:25.264 "is_configured": true, 00:22:25.264 "data_offset": 2048, 00:22:25.264 "data_size": 63488 00:22:25.264 }, 00:22:25.264 { 00:22:25.264 "name": "BaseBdev3", 00:22:25.264 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:25.264 "is_configured": true, 00:22:25.264 "data_offset": 2048, 00:22:25.264 "data_size": 63488 00:22:25.264 }, 00:22:25.264 { 00:22:25.264 "name": "BaseBdev4", 00:22:25.264 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:25.264 "is_configured": true, 00:22:25.264 "data_offset": 2048, 00:22:25.264 "data_size": 63488 00:22:25.264 } 00:22:25.264 ] 00:22:25.264 }' 00:22:25.264 10:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:25.264 10:30:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:25.828 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:26.392 [2024-07-15 10:30:03.483845] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:26.392 
10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.392 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:26.649 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:26.649 "name": "Existed_Raid", 00:22:26.649 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:26.649 "strip_size_kb": 0, 00:22:26.649 "state": "configuring", 00:22:26.650 "raid_level": "raid1", 00:22:26.650 "superblock": true, 00:22:26.650 "num_base_bdevs": 4, 00:22:26.650 "num_base_bdevs_discovered": 2, 00:22:26.650 "num_base_bdevs_operational": 4, 00:22:26.650 "base_bdevs_list": [ 00:22:26.650 { 00:22:26.650 "name": "BaseBdev1", 00:22:26.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:26.650 "is_configured": false, 00:22:26.650 "data_offset": 0, 00:22:26.650 "data_size": 0 00:22:26.650 }, 00:22:26.650 { 00:22:26.650 
"name": null, 00:22:26.650 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:26.650 "is_configured": false, 00:22:26.650 "data_offset": 2048, 00:22:26.650 "data_size": 63488 00:22:26.650 }, 00:22:26.650 { 00:22:26.650 "name": "BaseBdev3", 00:22:26.650 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:26.650 "is_configured": true, 00:22:26.650 "data_offset": 2048, 00:22:26.650 "data_size": 63488 00:22:26.650 }, 00:22:26.650 { 00:22:26.650 "name": "BaseBdev4", 00:22:26.650 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:26.650 "is_configured": true, 00:22:26.650 "data_offset": 2048, 00:22:26.650 "data_size": 63488 00:22:26.650 } 00:22:26.650 ] 00:22:26.650 }' 00:22:26.650 10:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:26.650 10:30:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:27.215 10:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.215 10:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:27.472 10:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:27.472 10:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:27.730 [2024-07-15 10:30:04.759810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:27.730 BaseBdev1 00:22:27.730 10:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:27.730 10:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:27.730 10:30:04 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:27.730 10:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:27.730 10:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:27.730 10:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:27.730 10:30:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:28.295 10:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:28.861 [ 00:22:28.861 { 00:22:28.861 "name": "BaseBdev1", 00:22:28.861 "aliases": [ 00:22:28.861 "f16b78b9-3071-4d78-a063-799f41085277" 00:22:28.861 ], 00:22:28.861 "product_name": "Malloc disk", 00:22:28.861 "block_size": 512, 00:22:28.861 "num_blocks": 65536, 00:22:28.861 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:28.861 "assigned_rate_limits": { 00:22:28.861 "rw_ios_per_sec": 0, 00:22:28.861 "rw_mbytes_per_sec": 0, 00:22:28.861 "r_mbytes_per_sec": 0, 00:22:28.861 "w_mbytes_per_sec": 0 00:22:28.861 }, 00:22:28.861 "claimed": true, 00:22:28.861 "claim_type": "exclusive_write", 00:22:28.861 "zoned": false, 00:22:28.861 "supported_io_types": { 00:22:28.861 "read": true, 00:22:28.861 "write": true, 00:22:28.861 "unmap": true, 00:22:28.861 "flush": true, 00:22:28.861 "reset": true, 00:22:28.861 "nvme_admin": false, 00:22:28.861 "nvme_io": false, 00:22:28.861 "nvme_io_md": false, 00:22:28.861 "write_zeroes": true, 00:22:28.861 "zcopy": true, 00:22:28.861 "get_zone_info": false, 00:22:28.861 "zone_management": false, 00:22:28.861 "zone_append": false, 00:22:28.861 "compare": false, 00:22:28.861 
"compare_and_write": false, 00:22:28.861 "abort": true, 00:22:28.861 "seek_hole": false, 00:22:28.861 "seek_data": false, 00:22:28.861 "copy": true, 00:22:28.861 "nvme_iov_md": false 00:22:28.861 }, 00:22:28.861 "memory_domains": [ 00:22:28.862 { 00:22:28.862 "dma_device_id": "system", 00:22:28.862 "dma_device_type": 1 00:22:28.862 }, 00:22:28.862 { 00:22:28.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:28.862 "dma_device_type": 2 00:22:28.862 } 00:22:28.862 ], 00:22:28.862 "driver_specific": {} 00:22:28.862 } 00:22:28.862 ] 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.862 10:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:28.862 10:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.862 "name": "Existed_Raid", 00:22:28.862 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:28.862 "strip_size_kb": 0, 00:22:28.862 "state": "configuring", 00:22:28.862 "raid_level": "raid1", 00:22:28.862 "superblock": true, 00:22:28.862 "num_base_bdevs": 4, 00:22:28.862 "num_base_bdevs_discovered": 3, 00:22:28.862 "num_base_bdevs_operational": 4, 00:22:28.862 "base_bdevs_list": [ 00:22:28.862 { 00:22:28.862 "name": "BaseBdev1", 00:22:28.862 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:28.862 "is_configured": true, 00:22:28.862 "data_offset": 2048, 00:22:28.862 "data_size": 63488 00:22:28.862 }, 00:22:28.862 { 00:22:28.862 "name": null, 00:22:28.862 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:28.862 "is_configured": false, 00:22:28.862 "data_offset": 2048, 00:22:28.862 "data_size": 63488 00:22:28.862 }, 00:22:28.862 { 00:22:28.862 "name": "BaseBdev3", 00:22:28.862 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:28.862 "is_configured": true, 00:22:28.862 "data_offset": 2048, 00:22:28.862 "data_size": 63488 00:22:28.862 }, 00:22:28.862 { 00:22:28.862 "name": "BaseBdev4", 00:22:28.862 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:28.862 "is_configured": true, 00:22:28.862 "data_offset": 2048, 00:22:28.862 "data_size": 63488 00:22:28.862 } 00:22:28.862 ] 00:22:28.862 }' 00:22:28.862 10:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.862 10:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:29.796 10:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.796 10:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:29.796 10:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:29.796 10:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:30.054 [2024-07-15 10:30:07.114130] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:30.054 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:30.345 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:30.345 "name": "Existed_Raid", 00:22:30.345 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:30.345 "strip_size_kb": 0, 00:22:30.345 "state": "configuring", 00:22:30.345 "raid_level": "raid1", 00:22:30.345 "superblock": true, 00:22:30.345 "num_base_bdevs": 4, 00:22:30.345 "num_base_bdevs_discovered": 2, 00:22:30.345 "num_base_bdevs_operational": 4, 00:22:30.345 "base_bdevs_list": [ 00:22:30.345 { 00:22:30.345 "name": "BaseBdev1", 00:22:30.345 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:30.345 "is_configured": true, 00:22:30.345 "data_offset": 2048, 00:22:30.345 "data_size": 63488 00:22:30.345 }, 00:22:30.345 { 00:22:30.345 "name": null, 00:22:30.345 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:30.345 "is_configured": false, 00:22:30.345 "data_offset": 2048, 00:22:30.345 "data_size": 63488 00:22:30.345 }, 00:22:30.345 { 00:22:30.345 "name": null, 00:22:30.345 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:30.345 "is_configured": false, 00:22:30.345 "data_offset": 2048, 00:22:30.345 "data_size": 63488 00:22:30.345 }, 00:22:30.345 { 00:22:30.345 "name": "BaseBdev4", 00:22:30.345 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:30.345 "is_configured": true, 00:22:30.345 "data_offset": 2048, 00:22:30.345 "data_size": 63488 00:22:30.345 } 00:22:30.345 ] 00:22:30.345 }' 00:22:30.345 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:30.345 10:30:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:30.911 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:30.911 10:30:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:30.911 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:30.911 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:31.478 [2024-07-15 10:30:08.545956] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:31.478 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:31.734 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.734 "name": "Existed_Raid", 00:22:31.734 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:31.734 "strip_size_kb": 0, 00:22:31.734 "state": "configuring", 00:22:31.734 "raid_level": "raid1", 00:22:31.734 "superblock": true, 00:22:31.734 "num_base_bdevs": 4, 00:22:31.734 "num_base_bdevs_discovered": 3, 00:22:31.734 "num_base_bdevs_operational": 4, 00:22:31.734 "base_bdevs_list": [ 00:22:31.734 { 00:22:31.734 "name": "BaseBdev1", 00:22:31.734 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:31.734 "is_configured": true, 00:22:31.734 "data_offset": 2048, 00:22:31.734 "data_size": 63488 00:22:31.734 }, 00:22:31.734 { 00:22:31.734 "name": null, 00:22:31.734 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:31.734 "is_configured": false, 00:22:31.734 "data_offset": 2048, 00:22:31.734 "data_size": 63488 00:22:31.734 }, 00:22:31.734 { 00:22:31.734 "name": "BaseBdev3", 00:22:31.734 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:31.735 "is_configured": true, 00:22:31.735 "data_offset": 2048, 00:22:31.735 "data_size": 63488 00:22:31.735 }, 00:22:31.735 { 00:22:31.735 "name": "BaseBdev4", 00:22:31.735 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:31.735 "is_configured": true, 00:22:31.735 "data_offset": 2048, 00:22:31.735 "data_size": 63488 00:22:31.735 } 00:22:31.735 ] 00:22:31.735 }' 00:22:31.735 10:30:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.735 10:30:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:32.299 10:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:32.299 10:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:32.557 10:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:32.557 10:30:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:33.120 [2024-07-15 10:30:10.158246] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.120 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.120 10:30:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:33.377 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.377 "name": "Existed_Raid", 00:22:33.377 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:33.377 "strip_size_kb": 0, 00:22:33.377 "state": "configuring", 00:22:33.377 "raid_level": "raid1", 00:22:33.377 "superblock": true, 00:22:33.377 "num_base_bdevs": 4, 00:22:33.377 "num_base_bdevs_discovered": 2, 00:22:33.377 "num_base_bdevs_operational": 4, 00:22:33.377 "base_bdevs_list": [ 00:22:33.377 { 00:22:33.377 "name": null, 00:22:33.377 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:33.377 "is_configured": false, 00:22:33.377 "data_offset": 2048, 00:22:33.377 "data_size": 63488 00:22:33.377 }, 00:22:33.377 { 00:22:33.377 "name": null, 00:22:33.377 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:33.377 "is_configured": false, 00:22:33.377 "data_offset": 2048, 00:22:33.377 "data_size": 63488 00:22:33.377 }, 00:22:33.377 { 00:22:33.377 "name": "BaseBdev3", 00:22:33.377 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:33.377 "is_configured": true, 00:22:33.377 "data_offset": 2048, 00:22:33.377 "data_size": 63488 00:22:33.377 }, 00:22:33.377 { 00:22:33.377 "name": "BaseBdev4", 00:22:33.377 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:33.377 "is_configured": true, 00:22:33.377 "data_offset": 2048, 00:22:33.377 "data_size": 63488 00:22:33.377 } 00:22:33.377 ] 00:22:33.377 }' 00:22:33.377 10:30:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.377 10:30:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:33.941 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.941 10:30:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:34.246 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:22:34.246 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:22:34.504 [2024-07-15 10:30:11.526335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.504 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.504 10:30:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.762 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.762 "name": "Existed_Raid", 00:22:34.762 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:34.762 "strip_size_kb": 0, 00:22:34.762 "state": "configuring", 00:22:34.762 "raid_level": "raid1", 00:22:34.762 "superblock": true, 00:22:34.763 "num_base_bdevs": 4, 00:22:34.763 "num_base_bdevs_discovered": 3, 00:22:34.763 "num_base_bdevs_operational": 4, 00:22:34.763 "base_bdevs_list": [ 00:22:34.763 { 00:22:34.763 "name": null, 00:22:34.763 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:34.763 "is_configured": false, 00:22:34.763 "data_offset": 2048, 00:22:34.763 "data_size": 63488 00:22:34.763 }, 00:22:34.763 { 00:22:34.763 "name": "BaseBdev2", 00:22:34.763 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:34.763 "is_configured": true, 00:22:34.763 "data_offset": 2048, 00:22:34.763 "data_size": 63488 00:22:34.763 }, 00:22:34.763 { 00:22:34.763 "name": "BaseBdev3", 00:22:34.763 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:34.763 "is_configured": true, 00:22:34.763 "data_offset": 2048, 00:22:34.763 "data_size": 63488 00:22:34.763 }, 00:22:34.763 { 00:22:34.763 "name": "BaseBdev4", 00:22:34.763 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:34.763 "is_configured": true, 00:22:34.763 "data_offset": 2048, 00:22:34.763 "data_size": 63488 00:22:34.763 } 00:22:34.763 ] 00:22:34.763 }' 00:22:34.763 10:30:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.763 10:30:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:35.328 10:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:35.328 10:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.586 10:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:22:35.586 10:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.586 10:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:22:35.845 10:30:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f16b78b9-3071-4d78-a063-799f41085277 00:22:36.103 [2024-07-15 10:30:13.093830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:22:36.103 [2024-07-15 10:30:13.094013] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1596180 00:22:36.103 [2024-07-15 10:30:13.094028] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:36.103 [2024-07-15 10:30:13.094206] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1596c20 00:22:36.103 [2024-07-15 10:30:13.094336] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1596180 00:22:36.103 [2024-07-15 10:30:13.094346] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1596180 00:22:36.103 [2024-07-15 10:30:13.094441] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:36.103 NewBaseBdev 00:22:36.103 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:22:36.103 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:22:36.103 10:30:13 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:36.103 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:22:36.103 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:36.103 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:36.103 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:36.360 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:22:36.617 [ 00:22:36.617 { 00:22:36.617 "name": "NewBaseBdev", 00:22:36.617 "aliases": [ 00:22:36.617 "f16b78b9-3071-4d78-a063-799f41085277" 00:22:36.617 ], 00:22:36.617 "product_name": "Malloc disk", 00:22:36.617 "block_size": 512, 00:22:36.617 "num_blocks": 65536, 00:22:36.617 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:36.617 "assigned_rate_limits": { 00:22:36.617 "rw_ios_per_sec": 0, 00:22:36.617 "rw_mbytes_per_sec": 0, 00:22:36.617 "r_mbytes_per_sec": 0, 00:22:36.617 "w_mbytes_per_sec": 0 00:22:36.617 }, 00:22:36.617 "claimed": true, 00:22:36.617 "claim_type": "exclusive_write", 00:22:36.617 "zoned": false, 00:22:36.617 "supported_io_types": { 00:22:36.617 "read": true, 00:22:36.617 "write": true, 00:22:36.617 "unmap": true, 00:22:36.617 "flush": true, 00:22:36.617 "reset": true, 00:22:36.617 "nvme_admin": false, 00:22:36.617 "nvme_io": false, 00:22:36.617 "nvme_io_md": false, 00:22:36.617 "write_zeroes": true, 00:22:36.617 "zcopy": true, 00:22:36.617 "get_zone_info": false, 00:22:36.617 "zone_management": false, 00:22:36.617 "zone_append": false, 00:22:36.617 "compare": false, 00:22:36.617 
"compare_and_write": false, 00:22:36.617 "abort": true, 00:22:36.617 "seek_hole": false, 00:22:36.617 "seek_data": false, 00:22:36.617 "copy": true, 00:22:36.617 "nvme_iov_md": false 00:22:36.617 }, 00:22:36.617 "memory_domains": [ 00:22:36.617 { 00:22:36.617 "dma_device_id": "system", 00:22:36.617 "dma_device_type": 1 00:22:36.617 }, 00:22:36.617 { 00:22:36.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.617 "dma_device_type": 2 00:22:36.617 } 00:22:36.617 ], 00:22:36.617 "driver_specific": {} 00:22:36.617 } 00:22:36.617 ] 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:36.617 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:36.874 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.874 "name": "Existed_Raid", 00:22:36.874 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:36.874 "strip_size_kb": 0, 00:22:36.874 "state": "online", 00:22:36.874 "raid_level": "raid1", 00:22:36.874 "superblock": true, 00:22:36.874 "num_base_bdevs": 4, 00:22:36.874 "num_base_bdevs_discovered": 4, 00:22:36.874 "num_base_bdevs_operational": 4, 00:22:36.874 "base_bdevs_list": [ 00:22:36.874 { 00:22:36.874 "name": "NewBaseBdev", 00:22:36.874 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:36.874 "is_configured": true, 00:22:36.874 "data_offset": 2048, 00:22:36.874 "data_size": 63488 00:22:36.874 }, 00:22:36.874 { 00:22:36.874 "name": "BaseBdev2", 00:22:36.874 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:36.874 "is_configured": true, 00:22:36.874 "data_offset": 2048, 00:22:36.874 "data_size": 63488 00:22:36.874 }, 00:22:36.874 { 00:22:36.874 "name": "BaseBdev3", 00:22:36.874 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:36.874 "is_configured": true, 00:22:36.874 "data_offset": 2048, 00:22:36.874 "data_size": 63488 00:22:36.874 }, 00:22:36.874 { 00:22:36.874 "name": "BaseBdev4", 00:22:36.874 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:36.874 "is_configured": true, 00:22:36.874 "data_offset": 2048, 00:22:36.874 "data_size": 63488 00:22:36.874 } 00:22:36.874 ] 00:22:36.874 }' 00:22:36.874 10:30:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.874 10:30:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:37.439 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:22:37.439 10:30:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:37.439 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:37.439 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:37.439 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:37.439 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:22:37.440 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:37.440 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:37.440 [2024-07-15 10:30:14.634257] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:37.698 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:37.698 "name": "Existed_Raid", 00:22:37.698 "aliases": [ 00:22:37.698 "1353a8a9-2c97-4ab6-b496-3919f306fed9" 00:22:37.698 ], 00:22:37.698 "product_name": "Raid Volume", 00:22:37.698 "block_size": 512, 00:22:37.698 "num_blocks": 63488, 00:22:37.698 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:37.698 "assigned_rate_limits": { 00:22:37.698 "rw_ios_per_sec": 0, 00:22:37.698 "rw_mbytes_per_sec": 0, 00:22:37.698 "r_mbytes_per_sec": 0, 00:22:37.698 "w_mbytes_per_sec": 0 00:22:37.698 }, 00:22:37.698 "claimed": false, 00:22:37.698 "zoned": false, 00:22:37.698 "supported_io_types": { 00:22:37.698 "read": true, 00:22:37.698 "write": true, 00:22:37.698 "unmap": false, 00:22:37.698 "flush": false, 00:22:37.698 "reset": true, 00:22:37.698 "nvme_admin": false, 00:22:37.698 "nvme_io": false, 00:22:37.698 "nvme_io_md": false, 00:22:37.698 "write_zeroes": true, 00:22:37.698 "zcopy": false, 00:22:37.698 
"get_zone_info": false, 00:22:37.698 "zone_management": false, 00:22:37.698 "zone_append": false, 00:22:37.698 "compare": false, 00:22:37.698 "compare_and_write": false, 00:22:37.698 "abort": false, 00:22:37.698 "seek_hole": false, 00:22:37.698 "seek_data": false, 00:22:37.698 "copy": false, 00:22:37.698 "nvme_iov_md": false 00:22:37.698 }, 00:22:37.698 "memory_domains": [ 00:22:37.698 { 00:22:37.698 "dma_device_id": "system", 00:22:37.698 "dma_device_type": 1 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.698 "dma_device_type": 2 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "dma_device_id": "system", 00:22:37.698 "dma_device_type": 1 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.698 "dma_device_type": 2 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "dma_device_id": "system", 00:22:37.698 "dma_device_type": 1 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.698 "dma_device_type": 2 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "dma_device_id": "system", 00:22:37.698 "dma_device_type": 1 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.698 "dma_device_type": 2 00:22:37.698 } 00:22:37.698 ], 00:22:37.698 "driver_specific": { 00:22:37.698 "raid": { 00:22:37.698 "uuid": "1353a8a9-2c97-4ab6-b496-3919f306fed9", 00:22:37.698 "strip_size_kb": 0, 00:22:37.698 "state": "online", 00:22:37.698 "raid_level": "raid1", 00:22:37.698 "superblock": true, 00:22:37.698 "num_base_bdevs": 4, 00:22:37.698 "num_base_bdevs_discovered": 4, 00:22:37.698 "num_base_bdevs_operational": 4, 00:22:37.698 "base_bdevs_list": [ 00:22:37.698 { 00:22:37.698 "name": "NewBaseBdev", 00:22:37.698 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:37.698 "is_configured": true, 00:22:37.698 "data_offset": 2048, 00:22:37.698 "data_size": 63488 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "name": "BaseBdev2", 00:22:37.698 
"uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:37.698 "is_configured": true, 00:22:37.698 "data_offset": 2048, 00:22:37.698 "data_size": 63488 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "name": "BaseBdev3", 00:22:37.698 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:37.698 "is_configured": true, 00:22:37.698 "data_offset": 2048, 00:22:37.698 "data_size": 63488 00:22:37.698 }, 00:22:37.698 { 00:22:37.698 "name": "BaseBdev4", 00:22:37.698 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:37.698 "is_configured": true, 00:22:37.698 "data_offset": 2048, 00:22:37.698 "data_size": 63488 00:22:37.698 } 00:22:37.698 ] 00:22:37.698 } 00:22:37.699 } 00:22:37.699 }' 00:22:37.699 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:37.699 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:22:37.699 BaseBdev2 00:22:37.699 BaseBdev3 00:22:37.699 BaseBdev4' 00:22:37.699 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:37.699 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:22:37.699 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:37.957 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:37.957 "name": "NewBaseBdev", 00:22:37.957 "aliases": [ 00:22:37.957 "f16b78b9-3071-4d78-a063-799f41085277" 00:22:37.957 ], 00:22:37.957 "product_name": "Malloc disk", 00:22:37.957 "block_size": 512, 00:22:37.957 "num_blocks": 65536, 00:22:37.957 "uuid": "f16b78b9-3071-4d78-a063-799f41085277", 00:22:37.957 "assigned_rate_limits": { 00:22:37.957 "rw_ios_per_sec": 0, 00:22:37.957 "rw_mbytes_per_sec": 0, 
00:22:37.957 "r_mbytes_per_sec": 0, 00:22:37.957 "w_mbytes_per_sec": 0 00:22:37.957 }, 00:22:37.957 "claimed": true, 00:22:37.957 "claim_type": "exclusive_write", 00:22:37.957 "zoned": false, 00:22:37.957 "supported_io_types": { 00:22:37.957 "read": true, 00:22:37.957 "write": true, 00:22:37.957 "unmap": true, 00:22:37.957 "flush": true, 00:22:37.957 "reset": true, 00:22:37.957 "nvme_admin": false, 00:22:37.957 "nvme_io": false, 00:22:37.957 "nvme_io_md": false, 00:22:37.957 "write_zeroes": true, 00:22:37.957 "zcopy": true, 00:22:37.957 "get_zone_info": false, 00:22:37.957 "zone_management": false, 00:22:37.957 "zone_append": false, 00:22:37.957 "compare": false, 00:22:37.957 "compare_and_write": false, 00:22:37.957 "abort": true, 00:22:37.957 "seek_hole": false, 00:22:37.957 "seek_data": false, 00:22:37.957 "copy": true, 00:22:37.957 "nvme_iov_md": false 00:22:37.957 }, 00:22:37.957 "memory_domains": [ 00:22:37.957 { 00:22:37.957 "dma_device_id": "system", 00:22:37.957 "dma_device_type": 1 00:22:37.957 }, 00:22:37.957 { 00:22:37.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:37.957 "dma_device_type": 2 00:22:37.957 } 00:22:37.957 ], 00:22:37.957 "driver_specific": {} 00:22:37.957 }' 00:22:37.957 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:37.957 10:30:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:37.957 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:37.957 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:37.957 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:37.957 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:37.957 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:38.215 10:30:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:38.215 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:38.215 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:38.215 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:38.215 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:38.215 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:38.215 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:38.215 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:38.473 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:38.473 "name": "BaseBdev2", 00:22:38.473 "aliases": [ 00:22:38.473 "0209e759-be72-4f74-a472-267421e9bc9c" 00:22:38.473 ], 00:22:38.473 "product_name": "Malloc disk", 00:22:38.473 "block_size": 512, 00:22:38.473 "num_blocks": 65536, 00:22:38.473 "uuid": "0209e759-be72-4f74-a472-267421e9bc9c", 00:22:38.473 "assigned_rate_limits": { 00:22:38.473 "rw_ios_per_sec": 0, 00:22:38.473 "rw_mbytes_per_sec": 0, 00:22:38.473 "r_mbytes_per_sec": 0, 00:22:38.473 "w_mbytes_per_sec": 0 00:22:38.473 }, 00:22:38.473 "claimed": true, 00:22:38.473 "claim_type": "exclusive_write", 00:22:38.473 "zoned": false, 00:22:38.473 "supported_io_types": { 00:22:38.473 "read": true, 00:22:38.473 "write": true, 00:22:38.473 "unmap": true, 00:22:38.473 "flush": true, 00:22:38.473 "reset": true, 00:22:38.473 "nvme_admin": false, 00:22:38.473 "nvme_io": false, 00:22:38.473 "nvme_io_md": false, 00:22:38.473 "write_zeroes": true, 00:22:38.473 "zcopy": true, 00:22:38.473 
"get_zone_info": false, 00:22:38.473 "zone_management": false, 00:22:38.473 "zone_append": false, 00:22:38.473 "compare": false, 00:22:38.473 "compare_and_write": false, 00:22:38.473 "abort": true, 00:22:38.473 "seek_hole": false, 00:22:38.473 "seek_data": false, 00:22:38.473 "copy": true, 00:22:38.473 "nvme_iov_md": false 00:22:38.473 }, 00:22:38.473 "memory_domains": [ 00:22:38.473 { 00:22:38.473 "dma_device_id": "system", 00:22:38.473 "dma_device_type": 1 00:22:38.473 }, 00:22:38.473 { 00:22:38.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:38.473 "dma_device_type": 2 00:22:38.473 } 00:22:38.473 ], 00:22:38.473 "driver_specific": {} 00:22:38.473 }' 00:22:38.473 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:38.473 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:38.473 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:38.473 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:38.731 10:30:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:38.990 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:38.990 "name": "BaseBdev3", 00:22:38.990 "aliases": [ 00:22:38.990 "c20cf15d-c0b6-4049-819b-34cc0269dd1a" 00:22:38.990 ], 00:22:38.990 "product_name": "Malloc disk", 00:22:38.990 "block_size": 512, 00:22:38.990 "num_blocks": 65536, 00:22:38.990 "uuid": "c20cf15d-c0b6-4049-819b-34cc0269dd1a", 00:22:38.990 "assigned_rate_limits": { 00:22:38.990 "rw_ios_per_sec": 0, 00:22:38.990 "rw_mbytes_per_sec": 0, 00:22:38.990 "r_mbytes_per_sec": 0, 00:22:38.990 "w_mbytes_per_sec": 0 00:22:38.990 }, 00:22:38.990 "claimed": true, 00:22:38.990 "claim_type": "exclusive_write", 00:22:38.990 "zoned": false, 00:22:38.990 "supported_io_types": { 00:22:38.990 "read": true, 00:22:38.990 "write": true, 00:22:38.990 "unmap": true, 00:22:38.990 "flush": true, 00:22:38.990 "reset": true, 00:22:38.990 "nvme_admin": false, 00:22:38.990 "nvme_io": false, 00:22:38.990 "nvme_io_md": false, 00:22:38.990 "write_zeroes": true, 00:22:38.990 "zcopy": true, 00:22:38.990 "get_zone_info": false, 00:22:38.990 "zone_management": false, 00:22:38.990 "zone_append": false, 00:22:38.990 "compare": false, 00:22:38.990 "compare_and_write": false, 00:22:38.990 "abort": true, 00:22:38.990 "seek_hole": false, 00:22:38.990 "seek_data": false, 00:22:38.990 "copy": true, 00:22:38.990 "nvme_iov_md": false 00:22:38.990 }, 00:22:38.990 "memory_domains": [ 00:22:38.990 { 00:22:38.990 "dma_device_id": "system", 00:22:38.990 "dma_device_type": 1 00:22:38.990 }, 00:22:38.990 { 00:22:38.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:38.990 
"dma_device_type": 2 00:22:38.990 } 00:22:38.990 ], 00:22:38.990 "driver_specific": {} 00:22:38.990 }' 00:22:38.990 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:39.249 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:39.508 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:39.508 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:39.508 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:39.508 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:39.508 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:39.766 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:39.766 "name": "BaseBdev4", 00:22:39.766 "aliases": [ 00:22:39.766 
"4071f189-8e09-413a-a384-3ad3cd185d19" 00:22:39.766 ], 00:22:39.766 "product_name": "Malloc disk", 00:22:39.766 "block_size": 512, 00:22:39.766 "num_blocks": 65536, 00:22:39.766 "uuid": "4071f189-8e09-413a-a384-3ad3cd185d19", 00:22:39.766 "assigned_rate_limits": { 00:22:39.766 "rw_ios_per_sec": 0, 00:22:39.766 "rw_mbytes_per_sec": 0, 00:22:39.766 "r_mbytes_per_sec": 0, 00:22:39.766 "w_mbytes_per_sec": 0 00:22:39.766 }, 00:22:39.766 "claimed": true, 00:22:39.766 "claim_type": "exclusive_write", 00:22:39.766 "zoned": false, 00:22:39.766 "supported_io_types": { 00:22:39.767 "read": true, 00:22:39.767 "write": true, 00:22:39.767 "unmap": true, 00:22:39.767 "flush": true, 00:22:39.767 "reset": true, 00:22:39.767 "nvme_admin": false, 00:22:39.767 "nvme_io": false, 00:22:39.767 "nvme_io_md": false, 00:22:39.767 "write_zeroes": true, 00:22:39.767 "zcopy": true, 00:22:39.767 "get_zone_info": false, 00:22:39.767 "zone_management": false, 00:22:39.767 "zone_append": false, 00:22:39.767 "compare": false, 00:22:39.767 "compare_and_write": false, 00:22:39.767 "abort": true, 00:22:39.767 "seek_hole": false, 00:22:39.767 "seek_data": false, 00:22:39.767 "copy": true, 00:22:39.767 "nvme_iov_md": false 00:22:39.767 }, 00:22:39.767 "memory_domains": [ 00:22:39.767 { 00:22:39.767 "dma_device_id": "system", 00:22:39.767 "dma_device_type": 1 00:22:39.767 }, 00:22:39.767 { 00:22:39.767 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:39.767 "dma_device_type": 2 00:22:39.767 } 00:22:39.767 ], 00:22:39.767 "driver_specific": {} 00:22:39.767 }' 00:22:39.767 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:39.767 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:39.767 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:39.767 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:39.767 10:30:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:39.767 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:39.767 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:40.025 10:30:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:40.025 10:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:40.025 10:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:40.025 10:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:40.025 10:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:40.025 10:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:40.284 [2024-07-15 10:30:17.325117] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:40.284 [2024-07-15 10:30:17.325143] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:40.284 [2024-07-15 10:30:17.325194] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:40.284 [2024-07-15 10:30:17.325481] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:40.284 [2024-07-15 10:30:17.325495] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1596180 name Existed_Raid, state offline 00:22:40.284 10:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 569049 00:22:40.284 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 569049 ']' 00:22:40.284 10:30:17 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 569049 00:22:40.284 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:40.285 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:40.285 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 569049 00:22:40.285 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:40.285 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:40.285 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 569049' 00:22:40.285 killing process with pid 569049 00:22:40.285 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 569049 00:22:40.285 [2024-07-15 10:30:17.391492] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:40.285 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 569049 00:22:40.285 [2024-07-15 10:30:17.427462] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:40.543 10:30:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:40.543 00:22:40.543 real 0m33.512s 00:22:40.543 user 1m1.589s 00:22:40.543 sys 0m5.906s 00:22:40.543 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:40.543 10:30:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:40.543 ************************************ 00:22:40.543 END TEST raid_state_function_test_sb 00:22:40.543 ************************************ 00:22:40.543 10:30:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:40.543 10:30:17 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test 
raid_superblock_test raid1 4 00:22:40.543 10:30:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:40.543 10:30:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:40.543 10:30:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:40.543 ************************************ 00:22:40.543 START TEST raid_superblock_test 00:22:40.543 ************************************ 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:40.543 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' 
raid1 '!=' raid1 ']' 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=574527 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 574527 /var/tmp/spdk-raid.sock 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 574527 ']' 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:40.544 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:40.544 10:30:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:40.803 [2024-07-15 10:30:17.772171] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:22:40.803 [2024-07-15 10:30:17.772235] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid574527 ] 00:22:40.803 [2024-07-15 10:30:17.898110] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:40.803 [2024-07-15 10:30:17.996364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:41.062 [2024-07-15 10:30:18.058234] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:41.062 [2024-07-15 10:30:18.058273] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:41.631 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:41.632 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:41.890 malloc1 00:22:41.890 10:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:42.149 [2024-07-15 10:30:19.172242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:42.149 [2024-07-15 10:30:19.172289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:42.149 [2024-07-15 10:30:19.172310] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d0570 00:22:42.149 [2024-07-15 10:30:19.172323] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:42.149 [2024-07-15 10:30:19.174072] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:42.149 [2024-07-15 10:30:19.174103] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:42.149 pt1 00:22:42.149 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:42.149 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:42.150 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:42.150 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:42.150 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:42.150 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:42.150 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:42.150 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:42.150 10:30:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:42.411 malloc2 00:22:42.411 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:42.671 [2024-07-15 10:30:19.658273] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:42.671 [2024-07-15 10:30:19.658319] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:42.671 [2024-07-15 10:30:19.658337] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d1970 00:22:42.671 [2024-07-15 10:30:19.658350] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:42.671 [2024-07-15 10:30:19.660002] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:42.671 [2024-07-15 10:30:19.660030] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:42.671 pt2 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:42.671 10:30:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:42.671 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:42.930 malloc3 00:22:42.930 10:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:43.189 [2024-07-15 10:30:20.161464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:43.189 [2024-07-15 10:30:20.161519] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:43.189 [2024-07-15 10:30:20.161544] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2268340 00:22:43.189 [2024-07-15 10:30:20.161557] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:43.189 [2024-07-15 10:30:20.163213] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:43.189 [2024-07-15 10:30:20.163242] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:43.189 pt3 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:43.189 
10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:43.189 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:43.448 malloc4 00:22:43.448 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:43.707 [2024-07-15 10:30:20.664749] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:43.707 [2024-07-15 10:30:20.664802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:43.707 [2024-07-15 10:30:20.664823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x226ac60 00:22:43.707 [2024-07-15 10:30:20.664835] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:43.707 [2024-07-15 10:30:20.666490] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:43.707 [2024-07-15 10:30:20.666520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:43.707 pt4 00:22:43.707 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:43.707 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:43.707 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:43.966 [2024-07-15 10:30:20.905404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:22:43.966 [2024-07-15 10:30:20.906749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:43.966 [2024-07-15 10:30:20.906806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:43.966 [2024-07-15 10:30:20.906850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:43.966 [2024-07-15 10:30:20.907034] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c8530 00:22:43.966 [2024-07-15 10:30:20.907046] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:43.966 [2024-07-15 10:30:20.907245] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c6770 00:22:43.966 [2024-07-15 10:30:20.907401] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c8530 00:22:43.966 [2024-07-15 10:30:20.907411] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20c8530 00:22:43.966 [2024-07-15 10:30:20.907517] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.966 10:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.225 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.225 "name": "raid_bdev1", 00:22:44.225 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:44.225 "strip_size_kb": 0, 00:22:44.225 "state": "online", 00:22:44.225 "raid_level": "raid1", 00:22:44.225 "superblock": true, 00:22:44.225 "num_base_bdevs": 4, 00:22:44.225 "num_base_bdevs_discovered": 4, 00:22:44.225 "num_base_bdevs_operational": 4, 00:22:44.225 "base_bdevs_list": [ 00:22:44.225 { 00:22:44.225 "name": "pt1", 00:22:44.225 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:44.225 "is_configured": true, 00:22:44.225 "data_offset": 2048, 00:22:44.225 "data_size": 63488 00:22:44.225 }, 00:22:44.225 { 00:22:44.225 "name": "pt2", 00:22:44.225 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:44.225 "is_configured": true, 00:22:44.225 "data_offset": 2048, 00:22:44.225 "data_size": 63488 00:22:44.225 }, 00:22:44.225 { 00:22:44.225 "name": "pt3", 00:22:44.225 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:44.225 "is_configured": true, 00:22:44.225 "data_offset": 2048, 00:22:44.225 "data_size": 63488 00:22:44.225 }, 00:22:44.225 { 00:22:44.225 "name": "pt4", 00:22:44.225 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:44.225 "is_configured": true, 00:22:44.225 "data_offset": 2048, 00:22:44.225 "data_size": 63488 00:22:44.225 } 00:22:44.225 ] 00:22:44.225 }' 00:22:44.225 10:30:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.225 10:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:44.855 10:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:44.855 [2024-07-15 10:30:21.980521] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:44.855 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:44.855 "name": "raid_bdev1", 00:22:44.855 "aliases": [ 00:22:44.855 "46c8357a-a25d-4535-ab41-5b5c08d3f6fb" 00:22:44.855 ], 00:22:44.855 "product_name": "Raid Volume", 00:22:44.855 "block_size": 512, 00:22:44.855 "num_blocks": 63488, 00:22:44.855 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:44.855 "assigned_rate_limits": { 00:22:44.855 "rw_ios_per_sec": 0, 00:22:44.855 "rw_mbytes_per_sec": 0, 00:22:44.855 "r_mbytes_per_sec": 0, 00:22:44.855 "w_mbytes_per_sec": 0 00:22:44.855 }, 00:22:44.855 "claimed": false, 00:22:44.855 "zoned": false, 00:22:44.855 "supported_io_types": { 00:22:44.855 "read": true, 00:22:44.855 "write": true, 00:22:44.855 
"unmap": false, 00:22:44.855 "flush": false, 00:22:44.855 "reset": true, 00:22:44.855 "nvme_admin": false, 00:22:44.855 "nvme_io": false, 00:22:44.855 "nvme_io_md": false, 00:22:44.855 "write_zeroes": true, 00:22:44.855 "zcopy": false, 00:22:44.855 "get_zone_info": false, 00:22:44.855 "zone_management": false, 00:22:44.855 "zone_append": false, 00:22:44.855 "compare": false, 00:22:44.855 "compare_and_write": false, 00:22:44.855 "abort": false, 00:22:44.855 "seek_hole": false, 00:22:44.855 "seek_data": false, 00:22:44.855 "copy": false, 00:22:44.855 "nvme_iov_md": false 00:22:44.855 }, 00:22:44.855 "memory_domains": [ 00:22:44.855 { 00:22:44.855 "dma_device_id": "system", 00:22:44.855 "dma_device_type": 1 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.855 "dma_device_type": 2 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "dma_device_id": "system", 00:22:44.855 "dma_device_type": 1 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.855 "dma_device_type": 2 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "dma_device_id": "system", 00:22:44.855 "dma_device_type": 1 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.855 "dma_device_type": 2 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "dma_device_id": "system", 00:22:44.855 "dma_device_type": 1 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.855 "dma_device_type": 2 00:22:44.855 } 00:22:44.855 ], 00:22:44.855 "driver_specific": { 00:22:44.855 "raid": { 00:22:44.855 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:44.855 "strip_size_kb": 0, 00:22:44.855 "state": "online", 00:22:44.855 "raid_level": "raid1", 00:22:44.855 "superblock": true, 00:22:44.855 "num_base_bdevs": 4, 00:22:44.855 "num_base_bdevs_discovered": 4, 00:22:44.855 "num_base_bdevs_operational": 4, 00:22:44.855 "base_bdevs_list": [ 00:22:44.855 { 00:22:44.855 "name": "pt1", 
00:22:44.855 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:44.855 "is_configured": true, 00:22:44.855 "data_offset": 2048, 00:22:44.855 "data_size": 63488 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "name": "pt2", 00:22:44.855 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:44.855 "is_configured": true, 00:22:44.855 "data_offset": 2048, 00:22:44.855 "data_size": 63488 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "name": "pt3", 00:22:44.855 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:44.855 "is_configured": true, 00:22:44.855 "data_offset": 2048, 00:22:44.855 "data_size": 63488 00:22:44.855 }, 00:22:44.855 { 00:22:44.855 "name": "pt4", 00:22:44.855 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:44.855 "is_configured": true, 00:22:44.855 "data_offset": 2048, 00:22:44.855 "data_size": 63488 00:22:44.855 } 00:22:44.855 ] 00:22:44.855 } 00:22:44.855 } 00:22:44.855 }' 00:22:44.855 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:44.855 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:44.855 pt2 00:22:44.855 pt3 00:22:44.855 pt4' 00:22:44.855 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.115 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:45.115 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.115 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.115 "name": "pt1", 00:22:45.115 "aliases": [ 00:22:45.115 "00000000-0000-0000-0000-000000000001" 00:22:45.115 ], 00:22:45.115 "product_name": "passthru", 00:22:45.115 "block_size": 512, 00:22:45.115 "num_blocks": 65536, 00:22:45.115 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:22:45.115 "assigned_rate_limits": { 00:22:45.115 "rw_ios_per_sec": 0, 00:22:45.115 "rw_mbytes_per_sec": 0, 00:22:45.115 "r_mbytes_per_sec": 0, 00:22:45.115 "w_mbytes_per_sec": 0 00:22:45.115 }, 00:22:45.115 "claimed": true, 00:22:45.115 "claim_type": "exclusive_write", 00:22:45.115 "zoned": false, 00:22:45.115 "supported_io_types": { 00:22:45.115 "read": true, 00:22:45.115 "write": true, 00:22:45.115 "unmap": true, 00:22:45.115 "flush": true, 00:22:45.115 "reset": true, 00:22:45.115 "nvme_admin": false, 00:22:45.115 "nvme_io": false, 00:22:45.115 "nvme_io_md": false, 00:22:45.115 "write_zeroes": true, 00:22:45.115 "zcopy": true, 00:22:45.115 "get_zone_info": false, 00:22:45.115 "zone_management": false, 00:22:45.115 "zone_append": false, 00:22:45.115 "compare": false, 00:22:45.115 "compare_and_write": false, 00:22:45.115 "abort": true, 00:22:45.115 "seek_hole": false, 00:22:45.115 "seek_data": false, 00:22:45.115 "copy": true, 00:22:45.115 "nvme_iov_md": false 00:22:45.115 }, 00:22:45.115 "memory_domains": [ 00:22:45.115 { 00:22:45.115 "dma_device_id": "system", 00:22:45.115 "dma_device_type": 1 00:22:45.115 }, 00:22:45.115 { 00:22:45.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.115 "dma_device_type": 2 00:22:45.115 } 00:22:45.115 ], 00:22:45.115 "driver_specific": { 00:22:45.115 "passthru": { 00:22:45.115 "name": "pt1", 00:22:45.115 "base_bdev_name": "malloc1" 00:22:45.115 } 00:22:45.115 } 00:22:45.115 }' 00:22:45.115 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.373 10:30:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:45.373 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.632 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.632 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:45.632 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.632 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:45.632 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.891 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.891 "name": "pt2", 00:22:45.891 "aliases": [ 00:22:45.891 "00000000-0000-0000-0000-000000000002" 00:22:45.891 ], 00:22:45.891 "product_name": "passthru", 00:22:45.891 "block_size": 512, 00:22:45.891 "num_blocks": 65536, 00:22:45.891 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:45.891 "assigned_rate_limits": { 00:22:45.891 "rw_ios_per_sec": 0, 00:22:45.891 "rw_mbytes_per_sec": 0, 00:22:45.891 "r_mbytes_per_sec": 0, 00:22:45.891 "w_mbytes_per_sec": 0 00:22:45.891 }, 00:22:45.891 "claimed": true, 00:22:45.892 "claim_type": "exclusive_write", 00:22:45.892 "zoned": false, 00:22:45.892 "supported_io_types": { 00:22:45.892 "read": true, 00:22:45.892 "write": true, 00:22:45.892 "unmap": true, 00:22:45.892 "flush": true, 00:22:45.892 "reset": true, 00:22:45.892 "nvme_admin": false, 00:22:45.892 
"nvme_io": false, 00:22:45.892 "nvme_io_md": false, 00:22:45.892 "write_zeroes": true, 00:22:45.892 "zcopy": true, 00:22:45.892 "get_zone_info": false, 00:22:45.892 "zone_management": false, 00:22:45.892 "zone_append": false, 00:22:45.892 "compare": false, 00:22:45.892 "compare_and_write": false, 00:22:45.892 "abort": true, 00:22:45.892 "seek_hole": false, 00:22:45.892 "seek_data": false, 00:22:45.892 "copy": true, 00:22:45.892 "nvme_iov_md": false 00:22:45.892 }, 00:22:45.892 "memory_domains": [ 00:22:45.892 { 00:22:45.892 "dma_device_id": "system", 00:22:45.892 "dma_device_type": 1 00:22:45.892 }, 00:22:45.892 { 00:22:45.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.892 "dma_device_type": 2 00:22:45.892 } 00:22:45.892 ], 00:22:45.892 "driver_specific": { 00:22:45.892 "passthru": { 00:22:45.892 "name": "pt2", 00:22:45.892 "base_bdev_name": "malloc2" 00:22:45.892 } 00:22:45.892 } 00:22:45.892 }' 00:22:45.892 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.892 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.892 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:45.892 10:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.892 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.892 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:45.892 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:46.150 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:46.407 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:46.407 "name": "pt3", 00:22:46.407 "aliases": [ 00:22:46.407 "00000000-0000-0000-0000-000000000003" 00:22:46.407 ], 00:22:46.407 "product_name": "passthru", 00:22:46.407 "block_size": 512, 00:22:46.407 "num_blocks": 65536, 00:22:46.407 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:46.407 "assigned_rate_limits": { 00:22:46.407 "rw_ios_per_sec": 0, 00:22:46.407 "rw_mbytes_per_sec": 0, 00:22:46.407 "r_mbytes_per_sec": 0, 00:22:46.407 "w_mbytes_per_sec": 0 00:22:46.407 }, 00:22:46.407 "claimed": true, 00:22:46.407 "claim_type": "exclusive_write", 00:22:46.407 "zoned": false, 00:22:46.407 "supported_io_types": { 00:22:46.407 "read": true, 00:22:46.407 "write": true, 00:22:46.407 "unmap": true, 00:22:46.407 "flush": true, 00:22:46.407 "reset": true, 00:22:46.407 "nvme_admin": false, 00:22:46.407 "nvme_io": false, 00:22:46.407 "nvme_io_md": false, 00:22:46.407 "write_zeroes": true, 00:22:46.407 "zcopy": true, 00:22:46.407 "get_zone_info": false, 00:22:46.407 "zone_management": false, 00:22:46.407 "zone_append": false, 00:22:46.407 "compare": false, 00:22:46.407 "compare_and_write": false, 00:22:46.407 "abort": true, 00:22:46.407 "seek_hole": false, 00:22:46.407 "seek_data": false, 00:22:46.407 "copy": true, 00:22:46.407 "nvme_iov_md": false 00:22:46.407 }, 00:22:46.407 "memory_domains": [ 00:22:46.407 { 00:22:46.407 "dma_device_id": "system", 00:22:46.407 
"dma_device_type": 1 00:22:46.407 }, 00:22:46.407 { 00:22:46.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.407 "dma_device_type": 2 00:22:46.407 } 00:22:46.407 ], 00:22:46.407 "driver_specific": { 00:22:46.407 "passthru": { 00:22:46.407 "name": "pt3", 00:22:46.407 "base_bdev_name": "malloc3" 00:22:46.407 } 00:22:46.407 } 00:22:46.407 }' 00:22:46.407 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.407 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.407 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:46.407 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.664 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.664 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:46.664 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.664 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.665 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:46.665 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.665 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.665 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:46.665 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:46.665 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:46.665 10:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:46.923 10:30:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:46.923 "name": "pt4", 00:22:46.923 "aliases": [ 00:22:46.923 "00000000-0000-0000-0000-000000000004" 00:22:46.923 ], 00:22:46.923 "product_name": "passthru", 00:22:46.923 "block_size": 512, 00:22:46.923 "num_blocks": 65536, 00:22:46.923 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:46.923 "assigned_rate_limits": { 00:22:46.923 "rw_ios_per_sec": 0, 00:22:46.923 "rw_mbytes_per_sec": 0, 00:22:46.923 "r_mbytes_per_sec": 0, 00:22:46.923 "w_mbytes_per_sec": 0 00:22:46.923 }, 00:22:46.923 "claimed": true, 00:22:46.923 "claim_type": "exclusive_write", 00:22:46.923 "zoned": false, 00:22:46.923 "supported_io_types": { 00:22:46.923 "read": true, 00:22:46.923 "write": true, 00:22:46.923 "unmap": true, 00:22:46.923 "flush": true, 00:22:46.923 "reset": true, 00:22:46.923 "nvme_admin": false, 00:22:46.923 "nvme_io": false, 00:22:46.923 "nvme_io_md": false, 00:22:46.923 "write_zeroes": true, 00:22:46.923 "zcopy": true, 00:22:46.923 "get_zone_info": false, 00:22:46.923 "zone_management": false, 00:22:46.923 "zone_append": false, 00:22:46.923 "compare": false, 00:22:46.923 "compare_and_write": false, 00:22:46.923 "abort": true, 00:22:46.923 "seek_hole": false, 00:22:46.923 "seek_data": false, 00:22:46.923 "copy": true, 00:22:46.923 "nvme_iov_md": false 00:22:46.923 }, 00:22:46.923 "memory_domains": [ 00:22:46.923 { 00:22:46.923 "dma_device_id": "system", 00:22:46.923 "dma_device_type": 1 00:22:46.923 }, 00:22:46.923 { 00:22:46.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:46.923 "dma_device_type": 2 00:22:46.923 } 00:22:46.923 ], 00:22:46.923 "driver_specific": { 00:22:46.923 "passthru": { 00:22:46.923 "name": "pt4", 00:22:46.923 "base_bdev_name": "malloc4" 00:22:46.923 } 00:22:46.923 } 00:22:46.923 }' 00:22:46.923 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.923 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:46.923 10:30:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:46.923 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:47.181 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:47.439 [2024-07-15 10:30:24.515240] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:47.439 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=46c8357a-a25d-4535-ab41-5b5c08d3f6fb 00:22:47.439 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 46c8357a-a25d-4535-ab41-5b5c08d3f6fb ']' 00:22:47.439 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:47.697 [2024-07-15 10:30:24.763590] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:47.697 
[2024-07-15 10:30:24.763614] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:47.697 [2024-07-15 10:30:24.763667] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:47.697 [2024-07-15 10:30:24.763755] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:47.697 [2024-07-15 10:30:24.763768] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c8530 name raid_bdev1, state offline 00:22:47.697 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.697 10:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:47.955 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:47.955 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:47.955 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:47.955 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:48.213 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:48.213 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:48.471 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:48.471 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:48.730 10:30:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:48.730 10:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:48.988 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:48.988 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:49.246 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:49.505 [2024-07-15 10:30:26.504129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:49.505 [2024-07-15 10:30:26.505521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:49.505 [2024-07-15 10:30:26.505565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:49.505 [2024-07-15 10:30:26.505598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:49.505 [2024-07-15 10:30:26.505643] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:49.505 [2024-07-15 10:30:26.505683] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:49.505 [2024-07-15 10:30:26.505706] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:49.505 [2024-07-15 10:30:26.505728] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:49.505 [2024-07-15 
10:30:26.505746] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:49.505 [2024-07-15 10:30:26.505756] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2273ff0 name raid_bdev1, state configuring 00:22:49.505 request: 00:22:49.505 { 00:22:49.505 "name": "raid_bdev1", 00:22:49.505 "raid_level": "raid1", 00:22:49.505 "base_bdevs": [ 00:22:49.505 "malloc1", 00:22:49.505 "malloc2", 00:22:49.505 "malloc3", 00:22:49.505 "malloc4" 00:22:49.505 ], 00:22:49.505 "superblock": false, 00:22:49.505 "method": "bdev_raid_create", 00:22:49.505 "req_id": 1 00:22:49.505 } 00:22:49.505 Got JSON-RPC error response 00:22:49.505 response: 00:22:49.505 { 00:22:49.505 "code": -17, 00:22:49.505 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:49.505 } 00:22:49.505 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:49.505 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:49.505 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:49.505 10:30:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:49.505 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.505 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:49.763 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:49.763 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:49.763 10:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:50.021 [2024-07-15 10:30:26.997381] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:50.021 [2024-07-15 10:30:26.997431] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:50.021 [2024-07-15 10:30:26.997453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d07a0 00:22:50.021 [2024-07-15 10:30:26.997465] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:50.021 [2024-07-15 10:30:26.999103] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:50.021 [2024-07-15 10:30:26.999132] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:50.021 [2024-07-15 10:30:26.999203] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:50.021 [2024-07-15 10:30:26.999229] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:50.021 pt1 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:50.021 10:30:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.021 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.279 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.279 "name": "raid_bdev1", 00:22:50.279 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:50.279 "strip_size_kb": 0, 00:22:50.279 "state": "configuring", 00:22:50.279 "raid_level": "raid1", 00:22:50.279 "superblock": true, 00:22:50.279 "num_base_bdevs": 4, 00:22:50.279 "num_base_bdevs_discovered": 1, 00:22:50.279 "num_base_bdevs_operational": 4, 00:22:50.279 "base_bdevs_list": [ 00:22:50.279 { 00:22:50.279 "name": "pt1", 00:22:50.279 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:50.279 "is_configured": true, 00:22:50.279 "data_offset": 2048, 00:22:50.279 "data_size": 63488 00:22:50.279 }, 00:22:50.279 { 00:22:50.279 "name": null, 00:22:50.279 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:50.279 "is_configured": false, 00:22:50.279 "data_offset": 2048, 00:22:50.279 "data_size": 63488 00:22:50.279 }, 00:22:50.279 { 00:22:50.279 "name": null, 00:22:50.279 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:50.279 "is_configured": false, 00:22:50.279 "data_offset": 2048, 00:22:50.279 "data_size": 63488 00:22:50.279 }, 00:22:50.279 { 00:22:50.279 "name": null, 00:22:50.279 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:50.279 "is_configured": false, 00:22:50.279 "data_offset": 2048, 00:22:50.279 "data_size": 63488 00:22:50.279 } 00:22:50.279 ] 00:22:50.279 }' 00:22:50.279 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.279 10:30:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:22:50.845 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:50.845 10:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:51.104 [2024-07-15 10:30:28.100329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:51.104 [2024-07-15 10:30:28.100379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:51.104 [2024-07-15 10:30:28.100398] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2269940 00:22:51.104 [2024-07-15 10:30:28.100411] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:51.104 [2024-07-15 10:30:28.100748] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:51.104 [2024-07-15 10:30:28.100768] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:51.104 [2024-07-15 10:30:28.100831] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:51.104 [2024-07-15 10:30:28.100851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:51.104 pt2 00:22:51.104 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:51.362 [2024-07-15 10:30:28.344992] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.362 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.620 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:51.620 "name": "raid_bdev1", 00:22:51.620 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:51.620 "strip_size_kb": 0, 00:22:51.620 "state": "configuring", 00:22:51.620 "raid_level": "raid1", 00:22:51.620 "superblock": true, 00:22:51.620 "num_base_bdevs": 4, 00:22:51.620 "num_base_bdevs_discovered": 1, 00:22:51.620 "num_base_bdevs_operational": 4, 00:22:51.620 "base_bdevs_list": [ 00:22:51.621 { 00:22:51.621 "name": "pt1", 00:22:51.621 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:51.621 "is_configured": true, 00:22:51.621 "data_offset": 2048, 00:22:51.621 "data_size": 63488 00:22:51.621 }, 00:22:51.621 { 00:22:51.621 "name": null, 00:22:51.621 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:51.621 "is_configured": false, 00:22:51.621 "data_offset": 2048, 00:22:51.621 
"data_size": 63488 00:22:51.621 }, 00:22:51.621 { 00:22:51.621 "name": null, 00:22:51.621 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:51.621 "is_configured": false, 00:22:51.621 "data_offset": 2048, 00:22:51.621 "data_size": 63488 00:22:51.621 }, 00:22:51.621 { 00:22:51.621 "name": null, 00:22:51.621 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:51.621 "is_configured": false, 00:22:51.621 "data_offset": 2048, 00:22:51.621 "data_size": 63488 00:22:51.621 } 00:22:51.621 ] 00:22:51.621 }' 00:22:51.621 10:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:51.621 10:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:52.186 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:52.186 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:52.186 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:52.444 [2024-07-15 10:30:29.439908] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:52.444 [2024-07-15 10:30:29.439977] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.444 [2024-07-15 10:30:29.439997] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c7060 00:22:52.444 [2024-07-15 10:30:29.440010] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.444 [2024-07-15 10:30:29.440350] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.444 [2024-07-15 10:30:29.440369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:52.444 [2024-07-15 10:30:29.440435] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:22:52.444 [2024-07-15 10:30:29.440454] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:52.444 pt2 00:22:52.444 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:52.444 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:52.444 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:52.702 [2024-07-15 10:30:29.684556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:52.702 [2024-07-15 10:30:29.684594] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.702 [2024-07-15 10:30:29.684613] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c98d0 00:22:52.702 [2024-07-15 10:30:29.684626] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.702 [2024-07-15 10:30:29.684918] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.702 [2024-07-15 10:30:29.684950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:52.702 [2024-07-15 10:30:29.685009] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:52.702 [2024-07-15 10:30:29.685028] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:52.702 pt3 00:22:52.702 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:52.702 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:52.702 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:22:52.960 [2024-07-15 10:30:29.937230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:52.960 [2024-07-15 10:30:29.937269] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.960 [2024-07-15 10:30:29.937285] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20cab80 00:22:52.960 [2024-07-15 10:30:29.937297] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.960 [2024-07-15 10:30:29.937578] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.960 [2024-07-15 10:30:29.937596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:52.960 [2024-07-15 10:30:29.937645] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:52.960 [2024-07-15 10:30:29.937662] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:52.960 [2024-07-15 10:30:29.937779] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c7780 00:22:52.960 [2024-07-15 10:30:29.937790] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:52.960 [2024-07-15 10:30:29.937966] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ccfa0 00:22:52.960 [2024-07-15 10:30:29.938100] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c7780 00:22:52.960 [2024-07-15 10:30:29.938110] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20c7780 00:22:52.960 [2024-07-15 10:30:29.938215] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:52.960 pt4 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:52.960 10:30:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.960 10:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.218 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.218 "name": "raid_bdev1", 00:22:53.218 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:53.218 "strip_size_kb": 0, 00:22:53.218 "state": "online", 00:22:53.218 "raid_level": "raid1", 00:22:53.218 "superblock": true, 00:22:53.218 "num_base_bdevs": 4, 00:22:53.218 "num_base_bdevs_discovered": 4, 00:22:53.218 "num_base_bdevs_operational": 4, 00:22:53.218 "base_bdevs_list": [ 00:22:53.218 { 00:22:53.218 "name": "pt1", 00:22:53.218 "uuid": "00000000-0000-0000-0000-000000000001", 
00:22:53.218 "is_configured": true, 00:22:53.218 "data_offset": 2048, 00:22:53.218 "data_size": 63488 00:22:53.218 }, 00:22:53.218 { 00:22:53.218 "name": "pt2", 00:22:53.218 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:53.218 "is_configured": true, 00:22:53.218 "data_offset": 2048, 00:22:53.218 "data_size": 63488 00:22:53.218 }, 00:22:53.218 { 00:22:53.218 "name": "pt3", 00:22:53.218 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:53.218 "is_configured": true, 00:22:53.218 "data_offset": 2048, 00:22:53.218 "data_size": 63488 00:22:53.218 }, 00:22:53.218 { 00:22:53.218 "name": "pt4", 00:22:53.218 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:53.218 "is_configured": true, 00:22:53.218 "data_offset": 2048, 00:22:53.218 "data_size": 63488 00:22:53.218 } 00:22:53.218 ] 00:22:53.218 }' 00:22:53.218 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.218 10:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:53.784 10:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:54.042 [2024-07-15 10:30:31.020435] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:54.042 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:54.042 "name": "raid_bdev1", 00:22:54.042 "aliases": [ 00:22:54.042 "46c8357a-a25d-4535-ab41-5b5c08d3f6fb" 00:22:54.042 ], 00:22:54.042 "product_name": "Raid Volume", 00:22:54.042 "block_size": 512, 00:22:54.042 "num_blocks": 63488, 00:22:54.042 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:54.042 "assigned_rate_limits": { 00:22:54.042 "rw_ios_per_sec": 0, 00:22:54.042 "rw_mbytes_per_sec": 0, 00:22:54.042 "r_mbytes_per_sec": 0, 00:22:54.042 "w_mbytes_per_sec": 0 00:22:54.042 }, 00:22:54.042 "claimed": false, 00:22:54.042 "zoned": false, 00:22:54.042 "supported_io_types": { 00:22:54.042 "read": true, 00:22:54.042 "write": true, 00:22:54.042 "unmap": false, 00:22:54.042 "flush": false, 00:22:54.042 "reset": true, 00:22:54.042 "nvme_admin": false, 00:22:54.042 "nvme_io": false, 00:22:54.042 "nvme_io_md": false, 00:22:54.042 "write_zeroes": true, 00:22:54.042 "zcopy": false, 00:22:54.042 "get_zone_info": false, 00:22:54.042 "zone_management": false, 00:22:54.042 "zone_append": false, 00:22:54.042 "compare": false, 00:22:54.042 "compare_and_write": false, 00:22:54.042 "abort": false, 00:22:54.042 "seek_hole": false, 00:22:54.042 "seek_data": false, 00:22:54.042 "copy": false, 00:22:54.042 "nvme_iov_md": false 00:22:54.042 }, 00:22:54.042 "memory_domains": [ 00:22:54.042 { 00:22:54.042 "dma_device_id": "system", 00:22:54.042 "dma_device_type": 1 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.042 "dma_device_type": 2 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "dma_device_id": "system", 00:22:54.042 "dma_device_type": 1 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.042 "dma_device_type": 2 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "dma_device_id": "system", 00:22:54.042 
"dma_device_type": 1 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.042 "dma_device_type": 2 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "dma_device_id": "system", 00:22:54.042 "dma_device_type": 1 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.042 "dma_device_type": 2 00:22:54.042 } 00:22:54.042 ], 00:22:54.042 "driver_specific": { 00:22:54.042 "raid": { 00:22:54.042 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:54.042 "strip_size_kb": 0, 00:22:54.042 "state": "online", 00:22:54.042 "raid_level": "raid1", 00:22:54.042 "superblock": true, 00:22:54.042 "num_base_bdevs": 4, 00:22:54.042 "num_base_bdevs_discovered": 4, 00:22:54.042 "num_base_bdevs_operational": 4, 00:22:54.042 "base_bdevs_list": [ 00:22:54.042 { 00:22:54.042 "name": "pt1", 00:22:54.042 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:54.042 "is_configured": true, 00:22:54.042 "data_offset": 2048, 00:22:54.042 "data_size": 63488 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "name": "pt2", 00:22:54.042 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:54.042 "is_configured": true, 00:22:54.042 "data_offset": 2048, 00:22:54.042 "data_size": 63488 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "name": "pt3", 00:22:54.042 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:54.042 "is_configured": true, 00:22:54.042 "data_offset": 2048, 00:22:54.042 "data_size": 63488 00:22:54.042 }, 00:22:54.042 { 00:22:54.042 "name": "pt4", 00:22:54.042 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:54.042 "is_configured": true, 00:22:54.042 "data_offset": 2048, 00:22:54.042 "data_size": 63488 00:22:54.042 } 00:22:54.042 ] 00:22:54.042 } 00:22:54.042 } 00:22:54.042 }' 00:22:54.042 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:54.042 10:30:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:54.042 pt2 00:22:54.042 pt3 00:22:54.042 pt4' 00:22:54.042 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:54.042 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:54.042 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:54.299 "name": "pt1", 00:22:54.299 "aliases": [ 00:22:54.299 "00000000-0000-0000-0000-000000000001" 00:22:54.299 ], 00:22:54.299 "product_name": "passthru", 00:22:54.299 "block_size": 512, 00:22:54.299 "num_blocks": 65536, 00:22:54.299 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:54.299 "assigned_rate_limits": { 00:22:54.299 "rw_ios_per_sec": 0, 00:22:54.299 "rw_mbytes_per_sec": 0, 00:22:54.299 "r_mbytes_per_sec": 0, 00:22:54.299 "w_mbytes_per_sec": 0 00:22:54.299 }, 00:22:54.299 "claimed": true, 00:22:54.299 "claim_type": "exclusive_write", 00:22:54.299 "zoned": false, 00:22:54.299 "supported_io_types": { 00:22:54.299 "read": true, 00:22:54.299 "write": true, 00:22:54.299 "unmap": true, 00:22:54.299 "flush": true, 00:22:54.299 "reset": true, 00:22:54.299 "nvme_admin": false, 00:22:54.299 "nvme_io": false, 00:22:54.299 "nvme_io_md": false, 00:22:54.299 "write_zeroes": true, 00:22:54.299 "zcopy": true, 00:22:54.299 "get_zone_info": false, 00:22:54.299 "zone_management": false, 00:22:54.299 "zone_append": false, 00:22:54.299 "compare": false, 00:22:54.299 "compare_and_write": false, 00:22:54.299 "abort": true, 00:22:54.299 "seek_hole": false, 00:22:54.299 "seek_data": false, 00:22:54.299 "copy": true, 00:22:54.299 "nvme_iov_md": false 00:22:54.299 }, 00:22:54.299 "memory_domains": [ 00:22:54.299 { 00:22:54.299 "dma_device_id": "system", 00:22:54.299 
"dma_device_type": 1 00:22:54.299 }, 00:22:54.299 { 00:22:54.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.299 "dma_device_type": 2 00:22:54.299 } 00:22:54.299 ], 00:22:54.299 "driver_specific": { 00:22:54.299 "passthru": { 00:22:54.299 "name": "pt1", 00:22:54.299 "base_bdev_name": "malloc1" 00:22:54.299 } 00:22:54.299 } 00:22:54.299 }' 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:54.299 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:54.556 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:54.813 10:30:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:54.813 "name": "pt2", 00:22:54.813 "aliases": [ 00:22:54.813 "00000000-0000-0000-0000-000000000002" 00:22:54.813 ], 00:22:54.813 "product_name": "passthru", 00:22:54.813 "block_size": 512, 00:22:54.813 "num_blocks": 65536, 00:22:54.813 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:54.813 "assigned_rate_limits": { 00:22:54.813 "rw_ios_per_sec": 0, 00:22:54.813 "rw_mbytes_per_sec": 0, 00:22:54.813 "r_mbytes_per_sec": 0, 00:22:54.813 "w_mbytes_per_sec": 0 00:22:54.813 }, 00:22:54.813 "claimed": true, 00:22:54.813 "claim_type": "exclusive_write", 00:22:54.813 "zoned": false, 00:22:54.813 "supported_io_types": { 00:22:54.813 "read": true, 00:22:54.813 "write": true, 00:22:54.813 "unmap": true, 00:22:54.813 "flush": true, 00:22:54.813 "reset": true, 00:22:54.813 "nvme_admin": false, 00:22:54.813 "nvme_io": false, 00:22:54.813 "nvme_io_md": false, 00:22:54.813 "write_zeroes": true, 00:22:54.813 "zcopy": true, 00:22:54.813 "get_zone_info": false, 00:22:54.813 "zone_management": false, 00:22:54.813 "zone_append": false, 00:22:54.813 "compare": false, 00:22:54.813 "compare_and_write": false, 00:22:54.813 "abort": true, 00:22:54.813 "seek_hole": false, 00:22:54.813 "seek_data": false, 00:22:54.813 "copy": true, 00:22:54.813 "nvme_iov_md": false 00:22:54.813 }, 00:22:54.813 "memory_domains": [ 00:22:54.813 { 00:22:54.813 "dma_device_id": "system", 00:22:54.813 "dma_device_type": 1 00:22:54.813 }, 00:22:54.813 { 00:22:54.813 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:54.813 "dma_device_type": 2 00:22:54.813 } 00:22:54.813 ], 00:22:54.813 "driver_specific": { 00:22:54.813 "passthru": { 00:22:54.813 "name": "pt2", 00:22:54.813 "base_bdev_name": "malloc2" 00:22:54.813 } 00:22:54.813 } 00:22:54.813 }' 00:22:54.813 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.813 10:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:54.813 10:30:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:54.813 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:55.071 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:55.330 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:55.330 "name": "pt3", 00:22:55.330 "aliases": [ 00:22:55.330 "00000000-0000-0000-0000-000000000003" 00:22:55.330 ], 00:22:55.330 "product_name": "passthru", 00:22:55.330 "block_size": 512, 00:22:55.330 "num_blocks": 65536, 00:22:55.330 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:55.330 "assigned_rate_limits": { 00:22:55.330 "rw_ios_per_sec": 0, 00:22:55.330 "rw_mbytes_per_sec": 0, 00:22:55.330 "r_mbytes_per_sec": 0, 00:22:55.330 "w_mbytes_per_sec": 0 00:22:55.330 }, 00:22:55.330 "claimed": true, 00:22:55.330 
"claim_type": "exclusive_write", 00:22:55.330 "zoned": false, 00:22:55.330 "supported_io_types": { 00:22:55.330 "read": true, 00:22:55.330 "write": true, 00:22:55.330 "unmap": true, 00:22:55.330 "flush": true, 00:22:55.330 "reset": true, 00:22:55.330 "nvme_admin": false, 00:22:55.330 "nvme_io": false, 00:22:55.330 "nvme_io_md": false, 00:22:55.330 "write_zeroes": true, 00:22:55.330 "zcopy": true, 00:22:55.330 "get_zone_info": false, 00:22:55.330 "zone_management": false, 00:22:55.330 "zone_append": false, 00:22:55.330 "compare": false, 00:22:55.330 "compare_and_write": false, 00:22:55.330 "abort": true, 00:22:55.330 "seek_hole": false, 00:22:55.330 "seek_data": false, 00:22:55.330 "copy": true, 00:22:55.330 "nvme_iov_md": false 00:22:55.330 }, 00:22:55.330 "memory_domains": [ 00:22:55.330 { 00:22:55.330 "dma_device_id": "system", 00:22:55.330 "dma_device_type": 1 00:22:55.330 }, 00:22:55.330 { 00:22:55.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.330 "dma_device_type": 2 00:22:55.330 } 00:22:55.330 ], 00:22:55.330 "driver_specific": { 00:22:55.330 "passthru": { 00:22:55.330 "name": "pt3", 00:22:55.330 "base_bdev_name": "malloc3" 00:22:55.330 } 00:22:55.330 } 00:22:55.330 }' 00:22:55.330 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:55.588 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.846 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:55.846 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:55.846 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:55.846 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:55.846 10:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:56.104 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:56.104 "name": "pt4", 00:22:56.104 "aliases": [ 00:22:56.104 "00000000-0000-0000-0000-000000000004" 00:22:56.104 ], 00:22:56.104 "product_name": "passthru", 00:22:56.104 "block_size": 512, 00:22:56.104 "num_blocks": 65536, 00:22:56.104 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:56.104 "assigned_rate_limits": { 00:22:56.104 "rw_ios_per_sec": 0, 00:22:56.104 "rw_mbytes_per_sec": 0, 00:22:56.104 "r_mbytes_per_sec": 0, 00:22:56.104 "w_mbytes_per_sec": 0 00:22:56.104 }, 00:22:56.104 "claimed": true, 00:22:56.104 "claim_type": "exclusive_write", 00:22:56.104 "zoned": false, 00:22:56.104 "supported_io_types": { 00:22:56.104 "read": true, 00:22:56.104 "write": true, 00:22:56.104 "unmap": true, 00:22:56.104 "flush": true, 00:22:56.104 "reset": true, 00:22:56.104 "nvme_admin": false, 00:22:56.104 "nvme_io": false, 00:22:56.104 "nvme_io_md": false, 00:22:56.104 "write_zeroes": true, 00:22:56.104 "zcopy": true, 00:22:56.104 "get_zone_info": false, 00:22:56.104 "zone_management": false, 00:22:56.104 "zone_append": false, 00:22:56.104 "compare": false, 00:22:56.104 
"compare_and_write": false, 00:22:56.104 "abort": true, 00:22:56.104 "seek_hole": false, 00:22:56.104 "seek_data": false, 00:22:56.104 "copy": true, 00:22:56.104 "nvme_iov_md": false 00:22:56.104 }, 00:22:56.104 "memory_domains": [ 00:22:56.104 { 00:22:56.104 "dma_device_id": "system", 00:22:56.104 "dma_device_type": 1 00:22:56.104 }, 00:22:56.104 { 00:22:56.104 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.104 "dma_device_type": 2 00:22:56.104 } 00:22:56.104 ], 00:22:56.104 "driver_specific": { 00:22:56.104 "passthru": { 00:22:56.104 "name": "pt4", 00:22:56.104 "base_bdev_name": "malloc4" 00:22:56.104 } 00:22:56.104 } 00:22:56.104 }' 00:22:56.104 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:56.104 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:56.104 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:56.105 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:56.105 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:56.105 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:56.105 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:56.361 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:56.361 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:56.361 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:56.361 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:56.361 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:56.361 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:56.361 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:56.618 [2024-07-15 10:30:33.671572] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:56.618 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 46c8357a-a25d-4535-ab41-5b5c08d3f6fb '!=' 46c8357a-a25d-4535-ab41-5b5c08d3f6fb ']' 00:22:56.618 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:56.618 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:56.618 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:56.618 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:56.876 [2024-07-15 10:30:33.911951] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.876 10:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.133 10:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:57.133 "name": "raid_bdev1", 00:22:57.133 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:57.133 "strip_size_kb": 0, 00:22:57.133 "state": "online", 00:22:57.133 "raid_level": "raid1", 00:22:57.133 "superblock": true, 00:22:57.133 "num_base_bdevs": 4, 00:22:57.133 "num_base_bdevs_discovered": 3, 00:22:57.133 "num_base_bdevs_operational": 3, 00:22:57.133 "base_bdevs_list": [ 00:22:57.133 { 00:22:57.133 "name": null, 00:22:57.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.133 "is_configured": false, 00:22:57.133 "data_offset": 2048, 00:22:57.133 "data_size": 63488 00:22:57.133 }, 00:22:57.133 { 00:22:57.133 "name": "pt2", 00:22:57.133 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:57.133 "is_configured": true, 00:22:57.133 "data_offset": 2048, 00:22:57.133 "data_size": 63488 00:22:57.133 }, 00:22:57.133 { 00:22:57.133 "name": "pt3", 00:22:57.133 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:57.133 "is_configured": true, 00:22:57.133 "data_offset": 2048, 00:22:57.133 "data_size": 63488 00:22:57.133 }, 00:22:57.133 { 00:22:57.133 "name": "pt4", 00:22:57.133 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:57.133 "is_configured": true, 00:22:57.133 "data_offset": 2048, 00:22:57.133 "data_size": 63488 00:22:57.133 } 00:22:57.133 ] 00:22:57.133 }' 00:22:57.133 10:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:57.133 
10:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:57.698 10:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:57.956 [2024-07-15 10:30:35.006818] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:57.956 [2024-07-15 10:30:35.006844] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:57.956 [2024-07-15 10:30:35.006895] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:57.956 [2024-07-15 10:30:35.006962] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:57.956 [2024-07-15 10:30:35.006974] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c7780 name raid_bdev1, state offline 00:22:57.956 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.956 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:58.214 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:58.214 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:58.214 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:58.214 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:58.214 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:58.500 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:58.500 10:30:35 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:58.500 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:58.759 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:58.759 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:58.759 10:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:59.017 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:59.017 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:59.017 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:59.017 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:59.017 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:59.276 [2024-07-15 10:30:36.229986] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:59.276 [2024-07-15 10:30:36.230032] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:59.276 [2024-07-15 10:30:36.230051] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x226a700 00:22:59.276 [2024-07-15 10:30:36.230064] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:59.276 [2024-07-15 10:30:36.231700] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:59.276 [2024-07-15 10:30:36.231731] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:22:59.276 [2024-07-15 10:30:36.231796] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:59.276 [2024-07-15 10:30:36.231823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:59.276 pt2 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.276 "name": "raid_bdev1", 00:22:59.276 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:22:59.276 "strip_size_kb": 0, 00:22:59.276 "state": "configuring", 
00:22:59.276 "raid_level": "raid1", 00:22:59.276 "superblock": true, 00:22:59.276 "num_base_bdevs": 4, 00:22:59.276 "num_base_bdevs_discovered": 1, 00:22:59.276 "num_base_bdevs_operational": 3, 00:22:59.276 "base_bdevs_list": [ 00:22:59.276 { 00:22:59.276 "name": null, 00:22:59.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.276 "is_configured": false, 00:22:59.276 "data_offset": 2048, 00:22:59.276 "data_size": 63488 00:22:59.276 }, 00:22:59.276 { 00:22:59.276 "name": "pt2", 00:22:59.276 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:59.276 "is_configured": true, 00:22:59.276 "data_offset": 2048, 00:22:59.276 "data_size": 63488 00:22:59.276 }, 00:22:59.276 { 00:22:59.276 "name": null, 00:22:59.276 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:59.276 "is_configured": false, 00:22:59.276 "data_offset": 2048, 00:22:59.276 "data_size": 63488 00:22:59.276 }, 00:22:59.276 { 00:22:59.276 "name": null, 00:22:59.276 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:59.276 "is_configured": false, 00:22:59.276 "data_offset": 2048, 00:22:59.276 "data_size": 63488 00:22:59.276 } 00:22:59.276 ] 00:22:59.276 }' 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.276 10:30:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:59.842 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:59.842 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:59.842 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:00.100 [2024-07-15 10:30:37.248700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:00.100 [2024-07-15 10:30:37.248751] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:00.100 [2024-07-15 10:30:37.248773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d0a10 00:23:00.100 [2024-07-15 10:30:37.248786] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:00.100 [2024-07-15 10:30:37.249141] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:00.100 [2024-07-15 10:30:37.249161] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:00.100 [2024-07-15 10:30:37.249227] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:00.100 [2024-07-15 10:30:37.249255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:00.100 pt3 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.100 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.358 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.358 "name": "raid_bdev1", 00:23:00.358 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:23:00.358 "strip_size_kb": 0, 00:23:00.358 "state": "configuring", 00:23:00.358 "raid_level": "raid1", 00:23:00.358 "superblock": true, 00:23:00.358 "num_base_bdevs": 4, 00:23:00.358 "num_base_bdevs_discovered": 2, 00:23:00.358 "num_base_bdevs_operational": 3, 00:23:00.358 "base_bdevs_list": [ 00:23:00.358 { 00:23:00.358 "name": null, 00:23:00.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.358 "is_configured": false, 00:23:00.358 "data_offset": 2048, 00:23:00.358 "data_size": 63488 00:23:00.358 }, 00:23:00.358 { 00:23:00.358 "name": "pt2", 00:23:00.358 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:00.358 "is_configured": true, 00:23:00.358 "data_offset": 2048, 00:23:00.358 "data_size": 63488 00:23:00.358 }, 00:23:00.358 { 00:23:00.358 "name": "pt3", 00:23:00.358 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:00.358 "is_configured": true, 00:23:00.358 "data_offset": 2048, 00:23:00.358 "data_size": 63488 00:23:00.358 }, 00:23:00.358 { 00:23:00.358 "name": null, 00:23:00.358 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:00.358 "is_configured": false, 00:23:00.358 "data_offset": 2048, 00:23:00.358 "data_size": 63488 00:23:00.358 } 00:23:00.358 ] 00:23:00.358 }' 00:23:00.358 10:30:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.358 10:30:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:00.923 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:23:00.923 10:30:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:23:00.923 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:23:00.924 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:01.182 [2024-07-15 10:30:38.343604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:01.182 [2024-07-15 10:30:38.343658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:01.182 [2024-07-15 10:30:38.343677] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2273520 00:23:01.182 [2024-07-15 10:30:38.343689] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:01.182 [2024-07-15 10:30:38.344047] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:01.182 [2024-07-15 10:30:38.344068] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:01.182 [2024-07-15 10:30:38.344135] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:01.182 [2024-07-15 10:30:38.344157] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:01.182 [2024-07-15 10:30:38.344271] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20c7ea0 00:23:01.182 [2024-07-15 10:30:38.344289] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:01.182 [2024-07-15 10:30:38.344456] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20cc600 00:23:01.182 [2024-07-15 10:30:38.344584] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20c7ea0 00:23:01.182 [2024-07-15 10:30:38.344594] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x20c7ea0 00:23:01.182 [2024-07-15 10:30:38.344689] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:01.182 pt4 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.182 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.440 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.440 "name": "raid_bdev1", 00:23:01.440 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:23:01.440 "strip_size_kb": 0, 00:23:01.440 "state": "online", 00:23:01.440 "raid_level": "raid1", 00:23:01.440 "superblock": true, 00:23:01.440 "num_base_bdevs": 4, 00:23:01.440 "num_base_bdevs_discovered": 3, 00:23:01.440 
"num_base_bdevs_operational": 3, 00:23:01.440 "base_bdevs_list": [ 00:23:01.440 { 00:23:01.440 "name": null, 00:23:01.440 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.440 "is_configured": false, 00:23:01.440 "data_offset": 2048, 00:23:01.440 "data_size": 63488 00:23:01.440 }, 00:23:01.440 { 00:23:01.440 "name": "pt2", 00:23:01.440 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:01.440 "is_configured": true, 00:23:01.440 "data_offset": 2048, 00:23:01.440 "data_size": 63488 00:23:01.440 }, 00:23:01.440 { 00:23:01.440 "name": "pt3", 00:23:01.440 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:01.440 "is_configured": true, 00:23:01.440 "data_offset": 2048, 00:23:01.440 "data_size": 63488 00:23:01.440 }, 00:23:01.440 { 00:23:01.440 "name": "pt4", 00:23:01.440 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:01.440 "is_configured": true, 00:23:01.440 "data_offset": 2048, 00:23:01.440 "data_size": 63488 00:23:01.440 } 00:23:01.440 ] 00:23:01.440 }' 00:23:01.440 10:30:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.440 10:30:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:02.038 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:02.296 [2024-07-15 10:30:39.406435] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:02.296 [2024-07-15 10:30:39.406465] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:02.296 [2024-07-15 10:30:39.406521] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:02.296 [2024-07-15 10:30:39.406590] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:02.296 [2024-07-15 10:30:39.406603] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x20c7ea0 name raid_bdev1, state offline 00:23:02.296 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.296 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:23:02.554 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:23:02.554 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:23:02.554 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:23:02.554 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:23:02.554 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:02.812 10:30:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:03.071 [2024-07-15 10:30:40.156394] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:03.071 [2024-07-15 10:30:40.156448] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:03.071 [2024-07-15 10:30:40.156468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2273520 00:23:03.071 [2024-07-15 10:30:40.156480] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:03.071 [2024-07-15 10:30:40.158137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:03.071 [2024-07-15 10:30:40.158168] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:03.071 [2024-07-15 10:30:40.158238] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:23:03.071 [2024-07-15 10:30:40.158265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:03.071 [2024-07-15 10:30:40.158370] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:03.071 [2024-07-15 10:30:40.158384] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:03.071 [2024-07-15 10:30:40.158398] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20c7060 name raid_bdev1, state configuring 00:23:03.071 [2024-07-15 10:30:40.158421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:03.071 [2024-07-15 10:30:40.158498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:03.071 pt1 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:03.071 10:30:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.071 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.329 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:03.329 "name": "raid_bdev1", 00:23:03.329 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:23:03.329 "strip_size_kb": 0, 00:23:03.329 "state": "configuring", 00:23:03.329 "raid_level": "raid1", 00:23:03.329 "superblock": true, 00:23:03.329 "num_base_bdevs": 4, 00:23:03.329 "num_base_bdevs_discovered": 2, 00:23:03.329 "num_base_bdevs_operational": 3, 00:23:03.329 "base_bdevs_list": [ 00:23:03.329 { 00:23:03.329 "name": null, 00:23:03.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:03.329 "is_configured": false, 00:23:03.329 "data_offset": 2048, 00:23:03.329 "data_size": 63488 00:23:03.329 }, 00:23:03.329 { 00:23:03.329 "name": "pt2", 00:23:03.329 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:03.329 "is_configured": true, 00:23:03.329 "data_offset": 2048, 00:23:03.329 "data_size": 63488 00:23:03.329 }, 00:23:03.329 { 00:23:03.329 "name": "pt3", 00:23:03.329 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:03.329 "is_configured": true, 00:23:03.329 "data_offset": 2048, 00:23:03.329 "data_size": 63488 00:23:03.329 }, 00:23:03.329 { 00:23:03.329 "name": null, 00:23:03.329 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:03.329 "is_configured": false, 00:23:03.329 "data_offset": 2048, 00:23:03.329 "data_size": 63488 00:23:03.329 } 00:23:03.329 ] 00:23:03.329 }' 00:23:03.329 10:30:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:03.329 10:30:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:03.895 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:23:03.895 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:04.151 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:23:04.151 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:04.409 [2024-07-15 10:30:41.491920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:04.409 [2024-07-15 10:30:41.491984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.409 [2024-07-15 10:30:41.492003] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20c7310 00:23:04.409 [2024-07-15 10:30:41.492016] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.409 [2024-07-15 10:30:41.492362] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.409 [2024-07-15 10:30:41.492381] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:04.409 [2024-07-15 10:30:41.492446] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:04.409 [2024-07-15 10:30:41.492466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:04.409 [2024-07-15 10:30:41.492576] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20cab40 00:23:04.409 [2024-07-15 10:30:41.492586] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:04.409 [2024-07-15 10:30:41.492758] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x226a990 00:23:04.409 [2024-07-15 10:30:41.492887] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20cab40 00:23:04.409 [2024-07-15 10:30:41.492897] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20cab40 00:23:04.409 [2024-07-15 10:30:41.493008] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:04.409 pt4 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.409 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.666 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:04.666 "name": "raid_bdev1", 
00:23:04.666 "uuid": "46c8357a-a25d-4535-ab41-5b5c08d3f6fb", 00:23:04.666 "strip_size_kb": 0, 00:23:04.666 "state": "online", 00:23:04.666 "raid_level": "raid1", 00:23:04.666 "superblock": true, 00:23:04.666 "num_base_bdevs": 4, 00:23:04.666 "num_base_bdevs_discovered": 3, 00:23:04.666 "num_base_bdevs_operational": 3, 00:23:04.666 "base_bdevs_list": [ 00:23:04.666 { 00:23:04.666 "name": null, 00:23:04.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:04.666 "is_configured": false, 00:23:04.666 "data_offset": 2048, 00:23:04.666 "data_size": 63488 00:23:04.666 }, 00:23:04.666 { 00:23:04.666 "name": "pt2", 00:23:04.666 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:04.666 "is_configured": true, 00:23:04.666 "data_offset": 2048, 00:23:04.666 "data_size": 63488 00:23:04.666 }, 00:23:04.666 { 00:23:04.666 "name": "pt3", 00:23:04.666 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:04.666 "is_configured": true, 00:23:04.666 "data_offset": 2048, 00:23:04.666 "data_size": 63488 00:23:04.666 }, 00:23:04.666 { 00:23:04.666 "name": "pt4", 00:23:04.666 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:04.666 "is_configured": true, 00:23:04.666 "data_offset": 2048, 00:23:04.666 "data_size": 63488 00:23:04.666 } 00:23:04.666 ] 00:23:04.666 }' 00:23:04.666 10:30:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:04.666 10:30:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:05.231 10:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:05.231 10:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:05.490 10:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:23:05.490 10:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:05.490 10:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:23:05.748 [2024-07-15 10:30:42.779639] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:05.748 10:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 46c8357a-a25d-4535-ab41-5b5c08d3f6fb '!=' 46c8357a-a25d-4535-ab41-5b5c08d3f6fb ']' 00:23:05.748 10:30:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 574527 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 574527 ']' 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 574527 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 574527 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 574527' 00:23:05.749 killing process with pid 574527 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 574527 00:23:05.749 [2024-07-15 10:30:42.851248] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:05.749 [2024-07-15 10:30:42.851307] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:05.749 [2024-07-15 10:30:42.851379] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:23:05.749 [2024-07-15 10:30:42.851392] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20cab40 name raid_bdev1, state offline 00:23:05.749 10:30:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 574527 00:23:05.749 [2024-07-15 10:30:42.891234] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:06.007 10:30:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:23:06.007 00:23:06.007 real 0m25.402s 00:23:06.007 user 0m46.379s 00:23:06.007 sys 0m4.660s 00:23:06.007 10:30:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:06.007 10:30:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:06.007 ************************************ 00:23:06.007 END TEST raid_superblock_test 00:23:06.007 ************************************ 00:23:06.007 10:30:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:06.007 10:30:43 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:23:06.007 10:30:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:06.007 10:30:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:06.007 10:30:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:06.007 ************************************ 00:23:06.007 START TEST raid_read_error_test 00:23:06.007 ************************************ 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:23:06.007 10:30:43 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.O867h5K5Zm 00:23:06.007 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=578299 00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 578299 /var/tmp/spdk-raid.sock 00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 578299 ']' 00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:06.265 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:06.265 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:06.265 [2024-07-15 10:30:43.263533] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:23:06.265 [2024-07-15 10:30:43.263596] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid578299 ] 00:23:06.265 [2024-07-15 10:30:43.381267] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:06.522 [2024-07-15 10:30:43.486644] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:06.522 [2024-07-15 10:30:43.557914] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:06.522 [2024-07-15 10:30:43.557959] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:06.781 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:06.781 10:30:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:06.781 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:06.781 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:06.781 BaseBdev1_malloc 00:23:07.039 10:30:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:07.039 true 00:23:07.039 10:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:07.297 [2024-07-15 10:30:44.369516] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:07.297 [2024-07-15 10:30:44.369565] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.297 [2024-07-15 10:30:44.369587] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24030d0 00:23:07.297 [2024-07-15 10:30:44.369600] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.297 [2024-07-15 10:30:44.371506] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.297 [2024-07-15 10:30:44.371538] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:07.297 BaseBdev1 00:23:07.297 10:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:07.297 10:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:07.555 BaseBdev2_malloc 00:23:07.555 10:30:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:08.121 true 00:23:08.121 10:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:08.121 [2024-07-15 10:30:45.281701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:08.121 [2024-07-15 10:30:45.281748] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.121 [2024-07-15 10:30:45.281770] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x2407910 00:23:08.121 [2024-07-15 10:30:45.281788] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.121 [2024-07-15 10:30:45.283347] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.121 [2024-07-15 10:30:45.283376] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:08.121 BaseBdev2 00:23:08.121 10:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:08.121 10:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:08.379 BaseBdev3_malloc 00:23:08.379 10:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:08.637 true 00:23:08.637 10:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:08.895 [2024-07-15 10:30:45.917183] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:08.895 [2024-07-15 10:30:45.917228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.895 [2024-07-15 10:30:45.917249] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2409bd0 00:23:08.895 [2024-07-15 10:30:45.917262] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.895 [2024-07-15 10:30:45.918867] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.895 [2024-07-15 10:30:45.918897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:08.895 
BaseBdev3 00:23:08.895 10:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:08.895 10:30:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:09.152 BaseBdev4_malloc 00:23:09.152 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:09.152 true 00:23:09.152 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:09.410 [2024-07-15 10:30:46.548691] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:09.410 [2024-07-15 10:30:46.548735] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.410 [2024-07-15 10:30:46.548757] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x240aaa0 00:23:09.410 [2024-07-15 10:30:46.548771] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.410 [2024-07-15 10:30:46.550358] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.410 [2024-07-15 10:30:46.550389] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:09.410 BaseBdev4 00:23:09.410 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:09.668 [2024-07-15 10:30:46.713156] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:09.668 [2024-07-15 
10:30:46.714527] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:09.668 [2024-07-15 10:30:46.714596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:09.668 [2024-07-15 10:30:46.714656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:09.668 [2024-07-15 10:30:46.714888] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2404c20 00:23:09.668 [2024-07-15 10:30:46.714900] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:09.668 [2024-07-15 10:30:46.715109] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2259260 00:23:09.668 [2024-07-15 10:30:46.715265] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2404c20 00:23:09.668 [2024-07-15 10:30:46.715276] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2404c20 00:23:09.668 [2024-07-15 10:30:46.715385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.668 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.926 10:30:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:09.926 "name": "raid_bdev1", 00:23:09.926 "uuid": "f782a160-219e-48fd-bb2b-3df9261c8b0f", 00:23:09.926 "strip_size_kb": 0, 00:23:09.926 "state": "online", 00:23:09.926 "raid_level": "raid1", 00:23:09.926 "superblock": true, 00:23:09.926 "num_base_bdevs": 4, 00:23:09.926 "num_base_bdevs_discovered": 4, 00:23:09.926 "num_base_bdevs_operational": 4, 00:23:09.926 "base_bdevs_list": [ 00:23:09.926 { 00:23:09.926 "name": "BaseBdev1", 00:23:09.926 "uuid": "a7a4793d-e97f-5e78-90ea-c0c75a04ca6e", 00:23:09.926 "is_configured": true, 00:23:09.926 "data_offset": 2048, 00:23:09.926 "data_size": 63488 00:23:09.926 }, 00:23:09.926 { 00:23:09.926 "name": "BaseBdev2", 00:23:09.926 "uuid": "e9105fa9-3080-56dc-8ebe-c94b7a16354c", 00:23:09.926 "is_configured": true, 00:23:09.926 "data_offset": 2048, 00:23:09.926 "data_size": 63488 00:23:09.926 }, 00:23:09.926 { 00:23:09.926 "name": "BaseBdev3", 00:23:09.926 "uuid": "a86ff335-fa10-53b7-9c7c-31c9df3059a8", 00:23:09.926 "is_configured": true, 00:23:09.926 "data_offset": 2048, 00:23:09.926 "data_size": 63488 00:23:09.926 }, 00:23:09.926 { 00:23:09.926 "name": "BaseBdev4", 00:23:09.926 "uuid": "710c13d9-2d55-519e-a01b-184980a85c26", 00:23:09.926 "is_configured": true, 00:23:09.926 "data_offset": 2048, 00:23:09.926 "data_size": 63488 00:23:09.926 } 00:23:09.926 ] 00:23:09.926 }' 00:23:09.926 10:30:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:09.926 10:30:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:10.492 10:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:23:10.492 10:30:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:10.492 [2024-07-15 10:30:47.639888] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2258c60 00:23:11.427 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.686 10:30:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.945 10:30:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.945 "name": "raid_bdev1", 00:23:11.945 "uuid": "f782a160-219e-48fd-bb2b-3df9261c8b0f", 00:23:11.945 "strip_size_kb": 0, 00:23:11.945 "state": "online", 00:23:11.945 "raid_level": "raid1", 00:23:11.945 "superblock": true, 00:23:11.945 "num_base_bdevs": 4, 00:23:11.945 "num_base_bdevs_discovered": 4, 00:23:11.945 "num_base_bdevs_operational": 4, 00:23:11.945 "base_bdevs_list": [ 00:23:11.945 { 00:23:11.945 "name": "BaseBdev1", 00:23:11.945 "uuid": "a7a4793d-e97f-5e78-90ea-c0c75a04ca6e", 00:23:11.945 "is_configured": true, 00:23:11.945 "data_offset": 2048, 00:23:11.945 "data_size": 63488 00:23:11.945 }, 00:23:11.945 { 00:23:11.945 "name": "BaseBdev2", 00:23:11.945 "uuid": "e9105fa9-3080-56dc-8ebe-c94b7a16354c", 00:23:11.945 "is_configured": true, 00:23:11.945 "data_offset": 2048, 00:23:11.945 "data_size": 63488 00:23:11.945 }, 00:23:11.945 { 00:23:11.945 "name": "BaseBdev3", 00:23:11.945 "uuid": "a86ff335-fa10-53b7-9c7c-31c9df3059a8", 00:23:11.945 "is_configured": true, 00:23:11.945 "data_offset": 2048, 00:23:11.945 "data_size": 63488 00:23:11.945 }, 00:23:11.945 { 00:23:11.945 "name": "BaseBdev4", 00:23:11.945 "uuid": "710c13d9-2d55-519e-a01b-184980a85c26", 00:23:11.945 "is_configured": 
true, 00:23:11.945 "data_offset": 2048, 00:23:11.945 "data_size": 63488 00:23:11.945 } 00:23:11.945 ] 00:23:11.945 }' 00:23:11.945 10:30:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.945 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:12.514 10:30:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:12.830 [2024-07-15 10:30:49.859341] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:12.830 [2024-07-15 10:30:49.859379] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:12.830 [2024-07-15 10:30:49.862505] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:12.830 [2024-07-15 10:30:49.862543] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:12.830 [2024-07-15 10:30:49.862664] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:12.830 [2024-07-15 10:30:49.862676] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2404c20 name raid_bdev1, state offline 00:23:12.830 0 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 578299 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 578299 ']' 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 578299 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 578299 00:23:12.830 10:30:49 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 578299' 00:23:12.830 killing process with pid 578299 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 578299 00:23:12.830 [2024-07-15 10:30:49.925356] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:12.830 10:30:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 578299 00:23:12.830 [2024-07-15 10:30:49.956967] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:13.089 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.O867h5K5Zm 00:23:13.089 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:13.089 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:13.090 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:13.090 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:13.090 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:13.090 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:13.090 10:30:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:13.090 00:23:13.090 real 0m7.005s 00:23:13.090 user 0m11.559s 00:23:13.090 sys 0m1.270s 00:23:13.090 10:30:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:13.090 10:30:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:13.090 ************************************ 00:23:13.090 END TEST 
raid_read_error_test 00:23:13.090 ************************************ 00:23:13.090 10:30:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:13.090 10:30:50 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:23:13.090 10:30:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:13.090 10:30:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:13.090 10:30:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:13.090 ************************************ 00:23:13.090 START TEST raid_write_error_test 00:23:13.090 ************************************ 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:23:13.090 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.GSCzc1tprO 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=579283 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 579283 /var/tmp/spdk-raid.sock 00:23:13.349 10:30:50 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 579283 ']' 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:13.349 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:13.349 10:30:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:13.349 [2024-07-15 10:30:50.358503] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:23:13.349 [2024-07-15 10:30:50.358576] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid579283 ] 00:23:13.349 [2024-07-15 10:30:50.490551] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:13.607 [2024-07-15 10:30:50.593770] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:13.607 [2024-07-15 10:30:50.655805] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:13.607 [2024-07-15 10:30:50.655835] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:14.174 10:30:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:14.174 10:30:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:23:14.174 10:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:14.174 10:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:14.434 BaseBdev1_malloc 00:23:14.434 10:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:23:14.692 true 00:23:14.692 10:30:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:23:14.951 [2024-07-15 10:30:52.025907] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:23:14.951 [2024-07-15 10:30:52.025957] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:23:14.951 [2024-07-15 10:30:52.025978] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x146e0d0 00:23:14.951 [2024-07-15 10:30:52.025991] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:14.951 [2024-07-15 10:30:52.027773] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:14.951 [2024-07-15 10:30:52.027807] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:14.951 BaseBdev1 00:23:14.951 10:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:14.951 10:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:15.210 BaseBdev2_malloc 00:23:15.210 10:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:23:15.469 true 00:23:15.469 10:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:23:15.727 [2024-07-15 10:30:52.704262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:23:15.727 [2024-07-15 10:30:52.704307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:15.727 [2024-07-15 10:30:52.704326] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1472910 00:23:15.727 [2024-07-15 10:30:52.704338] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:15.727 [2024-07-15 10:30:52.705720] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:15.727 [2024-07-15 10:30:52.705747] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:15.727 BaseBdev2 00:23:15.727 10:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:15.728 10:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:15.987 BaseBdev3_malloc 00:23:15.987 10:30:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:23:15.987 true 00:23:15.987 10:30:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:23:16.245 [2024-07-15 10:30:53.302359] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:23:16.245 [2024-07-15 10:30:53.302402] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.245 [2024-07-15 10:30:53.302421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1474bd0 00:23:16.245 [2024-07-15 10:30:53.302434] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.245 [2024-07-15 10:30:53.303830] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.245 [2024-07-15 10:30:53.303858] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:16.245 BaseBdev3 00:23:16.245 10:30:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:23:16.245 10:30:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:16.503 BaseBdev4_malloc 00:23:16.503 10:30:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:23:16.503 true 00:23:16.503 10:30:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:23:16.761 [2024-07-15 10:30:53.840412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:23:16.761 [2024-07-15 10:30:53.840458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.761 [2024-07-15 10:30:53.840477] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1475aa0 00:23:16.761 [2024-07-15 10:30:53.840496] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.761 [2024-07-15 10:30:53.841965] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.761 [2024-07-15 10:30:53.841994] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:16.761 BaseBdev4 00:23:16.761 10:30:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:23:17.020 [2024-07-15 10:30:54.089103] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:17.020 [2024-07-15 10:30:54.090357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:17.020 [2024-07-15 10:30:54.090424] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:17.020 [2024-07-15 10:30:54.090484] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:17.020 [2024-07-15 10:30:54.090709] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x146fc20 00:23:17.020 [2024-07-15 10:30:54.090720] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:17.020 [2024-07-15 10:30:54.090911] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12c4260 00:23:17.020 [2024-07-15 10:30:54.091074] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x146fc20 00:23:17.020 [2024-07-15 10:30:54.091084] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x146fc20 00:23:17.020 [2024-07-15 10:30:54.091185] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.020 10:30:54 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.020 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.279 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.279 "name": "raid_bdev1", 00:23:17.279 "uuid": "b1ab0ef9-3a0d-41db-9bac-ef688be6ad3b", 00:23:17.279 "strip_size_kb": 0, 00:23:17.279 "state": "online", 00:23:17.279 "raid_level": "raid1", 00:23:17.279 "superblock": true, 00:23:17.279 "num_base_bdevs": 4, 00:23:17.279 "num_base_bdevs_discovered": 4, 00:23:17.279 "num_base_bdevs_operational": 4, 00:23:17.279 "base_bdevs_list": [ 00:23:17.279 { 00:23:17.279 "name": "BaseBdev1", 00:23:17.279 "uuid": "eb9abcfa-f1c2-58ff-819e-594f0ea40de7", 00:23:17.279 "is_configured": true, 00:23:17.279 "data_offset": 2048, 00:23:17.279 "data_size": 63488 00:23:17.279 }, 00:23:17.279 { 00:23:17.279 "name": "BaseBdev2", 00:23:17.279 "uuid": "ac79261e-7348-5a56-b77a-c0d6c9ced9e3", 00:23:17.279 "is_configured": true, 00:23:17.279 "data_offset": 2048, 00:23:17.279 "data_size": 63488 00:23:17.279 }, 00:23:17.279 { 00:23:17.279 "name": "BaseBdev3", 00:23:17.279 "uuid": "d48038f5-97db-57ba-926f-c563aec58e9f", 00:23:17.279 "is_configured": true, 00:23:17.279 "data_offset": 2048, 00:23:17.279 "data_size": 63488 00:23:17.279 }, 00:23:17.279 { 00:23:17.279 "name": "BaseBdev4", 00:23:17.279 "uuid": "3ed2a904-3e55-5d3a-a066-ccbd001dcce5", 00:23:17.279 "is_configured": true, 00:23:17.279 "data_offset": 2048, 00:23:17.279 "data_size": 63488 00:23:17.279 } 00:23:17.279 ] 00:23:17.279 }' 00:23:17.279 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.279 10:30:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:17.845 10:30:54 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:23:17.845 10:30:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:17.845 [2024-07-15 10:30:54.895509] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12c3c60 00:23:18.779 10:30:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:23:19.037 [2024-07-15 10:30:56.021200] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:23:19.037 [2024-07-15 10:30:56.021255] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:19.037 [2024-07-15 10:30:56.021469] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x12c3c60 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:19.037 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.038 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.297 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.297 "name": "raid_bdev1", 00:23:19.297 "uuid": "b1ab0ef9-3a0d-41db-9bac-ef688be6ad3b", 00:23:19.297 "strip_size_kb": 0, 00:23:19.297 "state": "online", 00:23:19.297 "raid_level": "raid1", 00:23:19.297 "superblock": true, 00:23:19.297 "num_base_bdevs": 4, 00:23:19.297 "num_base_bdevs_discovered": 3, 00:23:19.297 "num_base_bdevs_operational": 3, 00:23:19.297 "base_bdevs_list": [ 00:23:19.297 { 00:23:19.297 "name": null, 00:23:19.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.297 "is_configured": false, 00:23:19.297 "data_offset": 2048, 00:23:19.297 "data_size": 63488 00:23:19.297 }, 00:23:19.297 { 00:23:19.297 "name": "BaseBdev2", 00:23:19.297 "uuid": "ac79261e-7348-5a56-b77a-c0d6c9ced9e3", 00:23:19.297 "is_configured": true, 00:23:19.297 "data_offset": 2048, 00:23:19.297 "data_size": 63488 00:23:19.297 }, 00:23:19.297 { 00:23:19.297 "name": "BaseBdev3", 00:23:19.297 "uuid": "d48038f5-97db-57ba-926f-c563aec58e9f", 00:23:19.297 "is_configured": true, 00:23:19.297 "data_offset": 2048, 
00:23:19.297 "data_size": 63488 00:23:19.297 }, 00:23:19.297 { 00:23:19.297 "name": "BaseBdev4", 00:23:19.297 "uuid": "3ed2a904-3e55-5d3a-a066-ccbd001dcce5", 00:23:19.297 "is_configured": true, 00:23:19.297 "data_offset": 2048, 00:23:19.297 "data_size": 63488 00:23:19.297 } 00:23:19.297 ] 00:23:19.297 }' 00:23:19.297 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.297 10:30:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:19.864 10:30:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:20.123 [2024-07-15 10:30:57.128893] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:20.123 [2024-07-15 10:30:57.128933] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:20.123 [2024-07-15 10:30:57.132073] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:20.123 [2024-07-15 10:30:57.132109] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:20.123 [2024-07-15 10:30:57.132206] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:20.123 [2024-07-15 10:30:57.132218] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x146fc20 name raid_bdev1, state offline 00:23:20.123 0 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 579283 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 579283 ']' 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 579283 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 579283 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 579283' 00:23:20.123 killing process with pid 579283 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 579283 00:23:20.123 [2024-07-15 10:30:57.196591] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:20.123 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 579283 00:23:20.123 [2024-07-15 10:30:57.227274] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.GSCzc1tprO 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:23:20.382 00:23:20.382 real 0m7.177s 00:23:20.382 user 0m11.315s 00:23:20.382 sys 0m1.329s 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:23:20.382 10:30:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:23:20.382 ************************************ 00:23:20.382 END TEST raid_write_error_test 00:23:20.382 ************************************ 00:23:20.382 10:30:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:20.382 10:30:57 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:23:20.382 10:30:57 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:20.382 10:30:57 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:23:20.382 10:30:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:20.382 10:30:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:20.382 10:30:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:20.382 ************************************ 00:23:20.382 START TEST raid_rebuild_test 00:23:20.382 ************************************ 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:20.382 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=580420 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 580420 /var/tmp/spdk-raid.sock 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:20.383 10:30:57 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 580420 ']' 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:20.383 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:20.383 10:30:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:20.641 [2024-07-15 10:30:57.613546] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:23:20.641 [2024-07-15 10:30:57.613613] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid580420 ] 00:23:20.641 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:20.641 Zero copy mechanism will not be used. 
00:23:20.641 [2024-07-15 10:30:57.741057] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:20.900 [2024-07-15 10:30:57.846397] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:20.900 [2024-07-15 10:30:57.906740] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:20.900 [2024-07-15 10:30:57.906799] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:21.465 10:30:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:21.465 10:30:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:23:21.465 10:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:21.465 10:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:21.723 BaseBdev1_malloc 00:23:21.723 10:30:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:21.982 [2024-07-15 10:30:59.010313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:21.982 [2024-07-15 10:30:59.010360] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:21.982 [2024-07-15 10:30:59.010384] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27acd40 00:23:21.982 [2024-07-15 10:30:59.010397] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:21.982 [2024-07-15 10:30:59.012167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:21.982 [2024-07-15 10:30:59.012199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:21.982 BaseBdev1 00:23:21.982 10:30:59 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:21.982 10:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:22.240 BaseBdev2_malloc 00:23:22.240 10:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:22.498 [2024-07-15 10:30:59.504746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:22.498 [2024-07-15 10:30:59.504794] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:22.498 [2024-07-15 10:30:59.504818] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27ad860 00:23:22.498 [2024-07-15 10:30:59.504831] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:22.498 [2024-07-15 10:30:59.506376] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:22.498 [2024-07-15 10:30:59.506404] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:22.498 BaseBdev2 00:23:22.498 10:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:22.756 spare_malloc 00:23:22.756 10:30:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:23.014 spare_delay 00:23:23.014 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:23:23.272 [2024-07-15 10:31:00.243283] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:23.272 [2024-07-15 10:31:00.243333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.272 [2024-07-15 10:31:00.243356] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x295bec0 00:23:23.272 [2024-07-15 10:31:00.243369] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.272 [2024-07-15 10:31:00.245052] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.272 [2024-07-15 10:31:00.245084] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:23.272 spare 00:23:23.272 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:23.531 [2024-07-15 10:31:00.479919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:23.531 [2024-07-15 10:31:00.481282] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:23.531 [2024-07-15 10:31:00.481363] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x295d070 00:23:23.531 [2024-07-15 10:31:00.481374] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:23.531 [2024-07-15 10:31:00.481587] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2956490 00:23:23.531 [2024-07-15 10:31:00.481729] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x295d070 00:23:23.531 [2024-07-15 10:31:00.481739] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x295d070 00:23:23.531 [2024-07-15 10:31:00.481858] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:23.531 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:23.790 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:23.790 "name": "raid_bdev1", 00:23:23.790 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:23.790 "strip_size_kb": 0, 00:23:23.790 "state": "online", 00:23:23.790 "raid_level": "raid1", 00:23:23.790 "superblock": false, 00:23:23.790 "num_base_bdevs": 2, 00:23:23.790 "num_base_bdevs_discovered": 2, 00:23:23.790 "num_base_bdevs_operational": 2, 00:23:23.790 "base_bdevs_list": [ 00:23:23.790 { 00:23:23.790 "name": "BaseBdev1", 00:23:23.790 "uuid": 
"b69f4d2a-635f-531c-a32c-33fdf80b1130", 00:23:23.790 "is_configured": true, 00:23:23.790 "data_offset": 0, 00:23:23.790 "data_size": 65536 00:23:23.790 }, 00:23:23.790 { 00:23:23.790 "name": "BaseBdev2", 00:23:23.790 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:23.790 "is_configured": true, 00:23:23.790 "data_offset": 0, 00:23:23.790 "data_size": 65536 00:23:23.790 } 00:23:23.790 ] 00:23:23.790 }' 00:23:23.790 10:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:23.790 10:31:00 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:24.356 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:24.356 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:24.614 [2024-07-15 10:31:01.559011] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:24.614 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:24.614 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.614 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:24.872 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:24.873 10:31:01 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:24.873 10:31:01 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:24.873 [2024-07-15 10:31:02.048093] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2956490 00:23:24.873 /dev/nbd0 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:25.132 1+0 records in 00:23:25.132 1+0 records out 00:23:25.132 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250163 s, 16.4 MB/s 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:25.132 10:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:30.460 65536+0 records in 00:23:30.460 65536+0 records out 00:23:30.460 33554432 bytes (34 MB, 32 MiB) copied, 5.35316 s, 6.3 MB/s 00:23:30.460 10:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:30.460 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:23:30.460 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:30.460 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:30.460 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:30.460 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:30.460 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:30.719 [2024-07-15 10:31:07.739250] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:30.719 10:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:30.978 [2024-07-15 10:31:07.979923] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:30.978 10:31:08 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.978 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.237 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.237 "name": "raid_bdev1", 00:23:31.238 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:31.238 "strip_size_kb": 0, 00:23:31.238 "state": "online", 00:23:31.238 "raid_level": "raid1", 00:23:31.238 "superblock": false, 00:23:31.238 "num_base_bdevs": 2, 00:23:31.238 "num_base_bdevs_discovered": 1, 00:23:31.238 "num_base_bdevs_operational": 1, 00:23:31.238 "base_bdevs_list": [ 00:23:31.238 { 00:23:31.238 "name": null, 00:23:31.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.238 "is_configured": false, 00:23:31.238 "data_offset": 0, 00:23:31.238 "data_size": 65536 00:23:31.238 }, 00:23:31.238 { 00:23:31.238 "name": "BaseBdev2", 
00:23:31.238 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:31.238 "is_configured": true, 00:23:31.238 "data_offset": 0, 00:23:31.238 "data_size": 65536 00:23:31.238 } 00:23:31.238 ] 00:23:31.238 }' 00:23:31.238 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.238 10:31:08 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:31.807 10:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:32.067 [2024-07-15 10:31:09.058800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:32.067 [2024-07-15 10:31:09.063811] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x295d880 00:23:32.067 [2024-07-15 10:31:09.066027] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:32.067 10:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:33.004 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:33.004 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.004 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:33.004 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:33.004 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.004 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.004 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.262 10:31:10 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.262 "name": "raid_bdev1", 00:23:33.262 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:33.262 "strip_size_kb": 0, 00:23:33.262 "state": "online", 00:23:33.262 "raid_level": "raid1", 00:23:33.262 "superblock": false, 00:23:33.262 "num_base_bdevs": 2, 00:23:33.262 "num_base_bdevs_discovered": 2, 00:23:33.262 "num_base_bdevs_operational": 2, 00:23:33.262 "process": { 00:23:33.262 "type": "rebuild", 00:23:33.262 "target": "spare", 00:23:33.262 "progress": { 00:23:33.262 "blocks": 24576, 00:23:33.262 "percent": 37 00:23:33.262 } 00:23:33.262 }, 00:23:33.262 "base_bdevs_list": [ 00:23:33.262 { 00:23:33.262 "name": "spare", 00:23:33.262 "uuid": "6de22a70-d199-5a01-8e00-613a1a19af5d", 00:23:33.262 "is_configured": true, 00:23:33.262 "data_offset": 0, 00:23:33.262 "data_size": 65536 00:23:33.262 }, 00:23:33.262 { 00:23:33.262 "name": "BaseBdev2", 00:23:33.262 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:33.262 "is_configured": true, 00:23:33.262 "data_offset": 0, 00:23:33.262 "data_size": 65536 00:23:33.262 } 00:23:33.262 ] 00:23:33.262 }' 00:23:33.262 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.262 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:33.262 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.262 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:33.262 10:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:33.830 [2024-07-15 10:31:10.917810] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:33.830 [2024-07-15 10:31:10.980922] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:23:33.830 [2024-07-15 10:31:10.980978] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:33.830 [2024-07-15 10:31:10.980994] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:33.830 [2024-07-15 10:31:10.981002] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.830 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.088 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.088 "name": "raid_bdev1", 00:23:34.088 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:34.088 
"strip_size_kb": 0, 00:23:34.088 "state": "online", 00:23:34.088 "raid_level": "raid1", 00:23:34.088 "superblock": false, 00:23:34.088 "num_base_bdevs": 2, 00:23:34.088 "num_base_bdevs_discovered": 1, 00:23:34.088 "num_base_bdevs_operational": 1, 00:23:34.088 "base_bdevs_list": [ 00:23:34.088 { 00:23:34.088 "name": null, 00:23:34.088 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.088 "is_configured": false, 00:23:34.088 "data_offset": 0, 00:23:34.088 "data_size": 65536 00:23:34.088 }, 00:23:34.088 { 00:23:34.088 "name": "BaseBdev2", 00:23:34.088 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:34.088 "is_configured": true, 00:23:34.088 "data_offset": 0, 00:23:34.088 "data_size": 65536 00:23:34.088 } 00:23:34.088 ] 00:23:34.088 }' 00:23:34.088 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.088 10:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:35.023 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:35.023 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.023 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:35.023 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:35.023 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.023 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.023 10:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.023 10:31:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.023 "name": "raid_bdev1", 00:23:35.023 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 
00:23:35.023 "strip_size_kb": 0, 00:23:35.023 "state": "online", 00:23:35.023 "raid_level": "raid1", 00:23:35.023 "superblock": false, 00:23:35.023 "num_base_bdevs": 2, 00:23:35.023 "num_base_bdevs_discovered": 1, 00:23:35.023 "num_base_bdevs_operational": 1, 00:23:35.023 "base_bdevs_list": [ 00:23:35.023 { 00:23:35.023 "name": null, 00:23:35.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.023 "is_configured": false, 00:23:35.023 "data_offset": 0, 00:23:35.023 "data_size": 65536 00:23:35.023 }, 00:23:35.023 { 00:23:35.023 "name": "BaseBdev2", 00:23:35.023 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:35.023 "is_configured": true, 00:23:35.023 "data_offset": 0, 00:23:35.023 "data_size": 65536 00:23:35.023 } 00:23:35.023 ] 00:23:35.023 }' 00:23:35.023 10:31:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.023 10:31:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:35.023 10:31:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.023 10:31:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:35.023 10:31:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:35.282 [2024-07-15 10:31:12.345001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:35.282 [2024-07-15 10:31:12.349913] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2956490 00:23:35.282 [2024-07-15 10:31:12.351391] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:35.282 10:31:12 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:36.222 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:23:36.222 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.222 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.222 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.222 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.222 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.222 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.483 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.483 "name": "raid_bdev1", 00:23:36.483 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:36.483 "strip_size_kb": 0, 00:23:36.483 "state": "online", 00:23:36.483 "raid_level": "raid1", 00:23:36.483 "superblock": false, 00:23:36.483 "num_base_bdevs": 2, 00:23:36.483 "num_base_bdevs_discovered": 2, 00:23:36.483 "num_base_bdevs_operational": 2, 00:23:36.483 "process": { 00:23:36.483 "type": "rebuild", 00:23:36.483 "target": "spare", 00:23:36.483 "progress": { 00:23:36.483 "blocks": 24576, 00:23:36.483 "percent": 37 00:23:36.483 } 00:23:36.483 }, 00:23:36.483 "base_bdevs_list": [ 00:23:36.483 { 00:23:36.483 "name": "spare", 00:23:36.483 "uuid": "6de22a70-d199-5a01-8e00-613a1a19af5d", 00:23:36.483 "is_configured": true, 00:23:36.483 "data_offset": 0, 00:23:36.483 "data_size": 65536 00:23:36.483 }, 00:23:36.483 { 00:23:36.483 "name": "BaseBdev2", 00:23:36.483 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:36.483 "is_configured": true, 00:23:36.483 "data_offset": 0, 00:23:36.483 "data_size": 65536 00:23:36.483 } 00:23:36.483 ] 00:23:36.483 }' 00:23:36.483 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:23:36.483 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.483 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.741 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.741 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:36.741 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:36.741 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:36.741 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:36.741 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=758 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.742 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.000 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.000 "name": "raid_bdev1", 00:23:37.000 
"uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:37.000 "strip_size_kb": 0, 00:23:37.000 "state": "online", 00:23:37.000 "raid_level": "raid1", 00:23:37.000 "superblock": false, 00:23:37.000 "num_base_bdevs": 2, 00:23:37.000 "num_base_bdevs_discovered": 2, 00:23:37.000 "num_base_bdevs_operational": 2, 00:23:37.000 "process": { 00:23:37.000 "type": "rebuild", 00:23:37.000 "target": "spare", 00:23:37.000 "progress": { 00:23:37.000 "blocks": 30720, 00:23:37.000 "percent": 46 00:23:37.000 } 00:23:37.000 }, 00:23:37.000 "base_bdevs_list": [ 00:23:37.000 { 00:23:37.000 "name": "spare", 00:23:37.000 "uuid": "6de22a70-d199-5a01-8e00-613a1a19af5d", 00:23:37.000 "is_configured": true, 00:23:37.000 "data_offset": 0, 00:23:37.000 "data_size": 65536 00:23:37.000 }, 00:23:37.000 { 00:23:37.000 "name": "BaseBdev2", 00:23:37.000 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:37.000 "is_configured": true, 00:23:37.000 "data_offset": 0, 00:23:37.000 "data_size": 65536 00:23:37.000 } 00:23:37.000 ] 00:23:37.000 }' 00:23:37.000 10:31:13 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.000 10:31:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:37.000 10:31:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.000 10:31:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:37.000 10:31:14 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.937 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.196 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:38.196 "name": "raid_bdev1", 00:23:38.196 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:38.196 "strip_size_kb": 0, 00:23:38.196 "state": "online", 00:23:38.196 "raid_level": "raid1", 00:23:38.196 "superblock": false, 00:23:38.196 "num_base_bdevs": 2, 00:23:38.196 "num_base_bdevs_discovered": 2, 00:23:38.196 "num_base_bdevs_operational": 2, 00:23:38.196 "process": { 00:23:38.196 "type": "rebuild", 00:23:38.196 "target": "spare", 00:23:38.196 "progress": { 00:23:38.196 "blocks": 59392, 00:23:38.196 "percent": 90 00:23:38.196 } 00:23:38.196 }, 00:23:38.196 "base_bdevs_list": [ 00:23:38.196 { 00:23:38.196 "name": "spare", 00:23:38.196 "uuid": "6de22a70-d199-5a01-8e00-613a1a19af5d", 00:23:38.196 "is_configured": true, 00:23:38.196 "data_offset": 0, 00:23:38.196 "data_size": 65536 00:23:38.196 }, 00:23:38.196 { 00:23:38.196 "name": "BaseBdev2", 00:23:38.196 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:38.196 "is_configured": true, 00:23:38.196 "data_offset": 0, 00:23:38.196 "data_size": 65536 00:23:38.196 } 00:23:38.196 ] 00:23:38.196 }' 00:23:38.196 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:38.196 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:38.196 10:31:15 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:38.196 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:38.196 10:31:15 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:38.455 [2024-07-15 10:31:15.576470] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:38.455 [2024-07-15 10:31:15.576528] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:38.455 [2024-07-15 10:31:15.576565] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.393 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.652 "name": "raid_bdev1", 00:23:39.652 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:39.652 "strip_size_kb": 0, 00:23:39.652 "state": "online", 00:23:39.652 "raid_level": "raid1", 00:23:39.652 "superblock": false, 00:23:39.652 "num_base_bdevs": 2, 00:23:39.652 
"num_base_bdevs_discovered": 2, 00:23:39.652 "num_base_bdevs_operational": 2, 00:23:39.652 "base_bdevs_list": [ 00:23:39.652 { 00:23:39.652 "name": "spare", 00:23:39.652 "uuid": "6de22a70-d199-5a01-8e00-613a1a19af5d", 00:23:39.652 "is_configured": true, 00:23:39.652 "data_offset": 0, 00:23:39.652 "data_size": 65536 00:23:39.652 }, 00:23:39.652 { 00:23:39.652 "name": "BaseBdev2", 00:23:39.652 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:39.652 "is_configured": true, 00:23:39.652 "data_offset": 0, 00:23:39.652 "data_size": 65536 00:23:39.652 } 00:23:39.652 ] 00:23:39.652 }' 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.652 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.911 10:31:16 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.911 "name": "raid_bdev1", 00:23:39.911 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:39.911 "strip_size_kb": 0, 00:23:39.911 "state": "online", 00:23:39.911 "raid_level": "raid1", 00:23:39.911 "superblock": false, 00:23:39.911 "num_base_bdevs": 2, 00:23:39.911 "num_base_bdevs_discovered": 2, 00:23:39.911 "num_base_bdevs_operational": 2, 00:23:39.911 "base_bdevs_list": [ 00:23:39.911 { 00:23:39.911 "name": "spare", 00:23:39.911 "uuid": "6de22a70-d199-5a01-8e00-613a1a19af5d", 00:23:39.911 "is_configured": true, 00:23:39.911 "data_offset": 0, 00:23:39.911 "data_size": 65536 00:23:39.911 }, 00:23:39.911 { 00:23:39.911 "name": "BaseBdev2", 00:23:39.911 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:39.911 "is_configured": true, 00:23:39.911 "data_offset": 0, 00:23:39.911 "data_size": 65536 00:23:39.911 } 00:23:39.911 ] 00:23:39.911 }' 00:23:39.912 10:31:16 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.912 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.171 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.171 "name": "raid_bdev1", 00:23:40.171 "uuid": "b0232bf5-b5ce-466b-a187-1db4769a3f1e", 00:23:40.171 "strip_size_kb": 0, 00:23:40.171 "state": "online", 00:23:40.171 "raid_level": "raid1", 00:23:40.171 "superblock": false, 00:23:40.171 "num_base_bdevs": 2, 00:23:40.171 "num_base_bdevs_discovered": 2, 00:23:40.171 "num_base_bdevs_operational": 2, 00:23:40.171 "base_bdevs_list": [ 00:23:40.171 { 00:23:40.171 "name": "spare", 00:23:40.171 "uuid": "6de22a70-d199-5a01-8e00-613a1a19af5d", 00:23:40.171 "is_configured": true, 00:23:40.171 "data_offset": 0, 00:23:40.171 "data_size": 65536 00:23:40.171 }, 00:23:40.171 { 00:23:40.171 "name": "BaseBdev2", 00:23:40.171 "uuid": "4b2d8d6c-2643-5c02-8b51-dd552c89f21a", 00:23:40.171 "is_configured": true, 00:23:40.171 "data_offset": 0, 00:23:40.171 "data_size": 65536 00:23:40.171 } 00:23:40.171 ] 00:23:40.171 }' 00:23:40.171 10:31:17 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.171 10:31:17 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:40.739 10:31:17 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:40.998 [2024-07-15 10:31:18.152468] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:40.998 [2024-07-15 10:31:18.152495] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:40.998 [2024-07-15 10:31:18.152552] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:40.998 [2024-07-15 10:31:18.152612] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:40.998 [2024-07-15 10:31:18.152624] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x295d070 name raid_bdev1, state offline 00:23:40.998 10:31:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:23:40.998 10:31:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:41.257 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:41.517 /dev/nbd0 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:41.517 1+0 records in 00:23:41.517 1+0 records out 00:23:41.517 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000152758 s, 26.8 MB/s 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:41.517 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:41.776 /dev/nbd1 00:23:41.776 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:41.776 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:41.776 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:41.776 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:42.035 1+0 records in 00:23:42.035 1+0 records out 00:23:42.035 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240846 s, 17.0 MB/s 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:42.035 10:31:18 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:42.035 10:31:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:42.035 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:42.035 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:42.035 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:42.035 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:42.035 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:42.035 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:42.293 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:42.552 10:31:19 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 580420 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 580420 ']' 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 580420 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:23:42.552 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:42.810 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 580420 00:23:42.810 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:42.810 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:42.810 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 580420' 00:23:42.810 killing process with pid 580420 00:23:42.810 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 580420 00:23:42.810 Received shutdown signal, test time was about 60.000000 seconds 00:23:42.810 00:23:42.810 Latency(us) 00:23:42.810 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:42.810 =================================================================================================================== 00:23:42.810 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 
00:23:42.810 [2024-07-15 10:31:19.790025] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:42.810 10:31:19 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 580420 00:23:42.810 [2024-07-15 10:31:19.816033] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:23:43.068 00:23:43.068 real 0m22.487s 00:23:43.068 user 0m30.330s 00:23:43.068 sys 0m5.004s 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:43.068 ************************************ 00:23:43.068 END TEST raid_rebuild_test 00:23:43.068 ************************************ 00:23:43.068 10:31:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:43.068 10:31:20 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:23:43.068 10:31:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:43.068 10:31:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:43.068 10:31:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:43.068 ************************************ 00:23:43.068 START TEST raid_rebuild_test_sb 00:23:43.068 ************************************ 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 
00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:43.068 10:31:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=583479 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 583479 /var/tmp/spdk-raid.sock 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 583479 ']' 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:43.068 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:43.068 10:31:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:43.068 [2024-07-15 10:31:20.188882] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:23:43.068 [2024-07-15 10:31:20.188958] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid583479 ] 00:23:43.068 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:43.068 Zero copy mechanism will not be used. 
00:23:43.325 [2024-07-15 10:31:20.321181] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:43.325 [2024-07-15 10:31:20.422055] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:43.325 [2024-07-15 10:31:20.495478] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:43.325 [2024-07-15 10:31:20.495520] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:44.257 10:31:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:44.257 10:31:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:44.257 10:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:44.257 10:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:44.257 BaseBdev1_malloc 00:23:44.257 10:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:44.514 [2024-07-15 10:31:21.597741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:44.514 [2024-07-15 10:31:21.597787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:44.514 [2024-07-15 10:31:21.597809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a60d40 00:23:44.514 [2024-07-15 10:31:21.597822] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:44.514 [2024-07-15 10:31:21.599406] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:44.514 [2024-07-15 10:31:21.599436] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:44.514 BaseBdev1 
00:23:44.514 10:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:44.514 10:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:44.784 BaseBdev2_malloc 00:23:44.784 10:31:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:45.094 [2024-07-15 10:31:22.091945] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:45.094 [2024-07-15 10:31:22.091996] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.094 [2024-07-15 10:31:22.092020] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a61860 00:23:45.094 [2024-07-15 10:31:22.092032] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.094 [2024-07-15 10:31:22.093450] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.094 [2024-07-15 10:31:22.093480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:45.094 BaseBdev2 00:23:45.094 10:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:45.353 spare_malloc 00:23:45.353 10:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:45.611 spare_delay 00:23:45.611 10:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:45.870 [2024-07-15 10:31:22.826370] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:45.870 [2024-07-15 10:31:22.826417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.870 [2024-07-15 10:31:22.826438] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0fec0 00:23:45.870 [2024-07-15 10:31:22.826451] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.870 [2024-07-15 10:31:22.827975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.870 [2024-07-15 10:31:22.828005] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:45.870 spare 00:23:45.870 10:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:46.128 [2024-07-15 10:31:23.071046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:46.128 [2024-07-15 10:31:23.072306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:46.128 [2024-07-15 10:31:23.072480] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c11070 00:23:46.128 [2024-07-15 10:31:23.072493] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:46.128 [2024-07-15 10:31:23.072695] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0a490 00:23:46.128 [2024-07-15 10:31:23.072835] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c11070 00:23:46.128 [2024-07-15 10:31:23.072846] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1c11070 00:23:46.128 [2024-07-15 10:31:23.072948] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.128 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.387 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:46.387 "name": "raid_bdev1", 00:23:46.387 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:46.387 "strip_size_kb": 0, 00:23:46.387 "state": "online", 00:23:46.387 "raid_level": "raid1", 00:23:46.387 "superblock": true, 00:23:46.387 "num_base_bdevs": 2, 00:23:46.387 "num_base_bdevs_discovered": 2, 00:23:46.387 
"num_base_bdevs_operational": 2, 00:23:46.387 "base_bdevs_list": [ 00:23:46.387 { 00:23:46.387 "name": "BaseBdev1", 00:23:46.387 "uuid": "f4a4d010-1581-503e-a5fe-101de9eb1ec0", 00:23:46.387 "is_configured": true, 00:23:46.387 "data_offset": 2048, 00:23:46.387 "data_size": 63488 00:23:46.387 }, 00:23:46.387 { 00:23:46.387 "name": "BaseBdev2", 00:23:46.387 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:46.387 "is_configured": true, 00:23:46.387 "data_offset": 2048, 00:23:46.387 "data_size": 63488 00:23:46.387 } 00:23:46.387 ] 00:23:46.387 }' 00:23:46.387 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:46.387 10:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:46.953 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:46.953 10:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:47.211 [2024-07-15 10:31:24.158133] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:47.211 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:47.211 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.211 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:47.470 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:47.470 [2024-07-15 10:31:24.643217] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0a490 00:23:47.470 /dev/nbd0 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:47.729 1+0 records in 00:23:47.729 1+0 records out 00:23:47.729 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252751 s, 16.2 MB/s 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:47.729 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:47.730 10:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:51.915 63488+0 records in 00:23:51.915 63488+0 records out 00:23:51.915 32505856 bytes (33 MB, 
31 MiB) copied, 4.26323 s, 7.6 MB/s 00:23:51.915 10:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:51.915 10:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:51.915 10:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:51.915 10:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:51.915 10:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:51.915 10:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:51.915 10:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:52.172 [2024-07-15 10:31:29.238431] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:52.172 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:23:52.429 [2024-07-15 10:31:29.475107] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.429 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.687 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:52.687 "name": "raid_bdev1", 00:23:52.687 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:52.687 "strip_size_kb": 0, 00:23:52.687 "state": "online", 00:23:52.687 "raid_level": "raid1", 00:23:52.687 "superblock": true, 00:23:52.687 "num_base_bdevs": 2, 00:23:52.687 "num_base_bdevs_discovered": 1, 00:23:52.687 
"num_base_bdevs_operational": 1, 00:23:52.687 "base_bdevs_list": [ 00:23:52.687 { 00:23:52.687 "name": null, 00:23:52.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:52.687 "is_configured": false, 00:23:52.687 "data_offset": 2048, 00:23:52.687 "data_size": 63488 00:23:52.687 }, 00:23:52.687 { 00:23:52.687 "name": "BaseBdev2", 00:23:52.687 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:52.687 "is_configured": true, 00:23:52.687 "data_offset": 2048, 00:23:52.687 "data_size": 63488 00:23:52.687 } 00:23:52.687 ] 00:23:52.687 }' 00:23:52.687 10:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:52.687 10:31:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:53.254 10:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:53.512 [2024-07-15 10:31:30.529913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:53.512 [2024-07-15 10:31:30.534874] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c10ce0 00:23:53.512 [2024-07-15 10:31:30.537099] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:53.512 10:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:54.449 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:54.449 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.449 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:54.449 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:54.449 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:23:54.449 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.449 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.707 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.707 "name": "raid_bdev1", 00:23:54.707 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:54.707 "strip_size_kb": 0, 00:23:54.707 "state": "online", 00:23:54.707 "raid_level": "raid1", 00:23:54.707 "superblock": true, 00:23:54.707 "num_base_bdevs": 2, 00:23:54.707 "num_base_bdevs_discovered": 2, 00:23:54.707 "num_base_bdevs_operational": 2, 00:23:54.707 "process": { 00:23:54.707 "type": "rebuild", 00:23:54.707 "target": "spare", 00:23:54.707 "progress": { 00:23:54.707 "blocks": 24576, 00:23:54.707 "percent": 38 00:23:54.707 } 00:23:54.707 }, 00:23:54.707 "base_bdevs_list": [ 00:23:54.707 { 00:23:54.707 "name": "spare", 00:23:54.707 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:23:54.707 "is_configured": true, 00:23:54.707 "data_offset": 2048, 00:23:54.707 "data_size": 63488 00:23:54.708 }, 00:23:54.708 { 00:23:54.708 "name": "BaseBdev2", 00:23:54.708 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:54.708 "is_configured": true, 00:23:54.708 "data_offset": 2048, 00:23:54.708 "data_size": 63488 00:23:54.708 } 00:23:54.708 ] 00:23:54.708 }' 00:23:54.708 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.708 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:54.708 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.966 10:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:54.966 10:31:31 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:54.966 [2024-07-15 10:31:32.123463] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:54.966 [2024-07-15 10:31:32.149791] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:54.966 [2024-07-15 10:31:32.149838] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.966 [2024-07-15 10:31:32.149854] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:54.966 [2024-07-15 10:31:32.149862] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.225 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.484 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.484 "name": "raid_bdev1", 00:23:55.484 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:55.484 "strip_size_kb": 0, 00:23:55.484 "state": "online", 00:23:55.484 "raid_level": "raid1", 00:23:55.484 "superblock": true, 00:23:55.484 "num_base_bdevs": 2, 00:23:55.484 "num_base_bdevs_discovered": 1, 00:23:55.484 "num_base_bdevs_operational": 1, 00:23:55.484 "base_bdevs_list": [ 00:23:55.484 { 00:23:55.484 "name": null, 00:23:55.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.484 "is_configured": false, 00:23:55.484 "data_offset": 2048, 00:23:55.484 "data_size": 63488 00:23:55.484 }, 00:23:55.484 { 00:23:55.484 "name": "BaseBdev2", 00:23:55.484 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:55.484 "is_configured": true, 00:23:55.484 "data_offset": 2048, 00:23:55.484 "data_size": 63488 00:23:55.484 } 00:23:55.484 ] 00:23:55.484 }' 00:23:55.484 10:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.484 10:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:56.051 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:56.051 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:56.051 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:56.051 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:56.051 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:56.051 10:31:33 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.051 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.309 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.309 "name": "raid_bdev1", 00:23:56.309 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:56.309 "strip_size_kb": 0, 00:23:56.309 "state": "online", 00:23:56.309 "raid_level": "raid1", 00:23:56.309 "superblock": true, 00:23:56.309 "num_base_bdevs": 2, 00:23:56.309 "num_base_bdevs_discovered": 1, 00:23:56.309 "num_base_bdevs_operational": 1, 00:23:56.309 "base_bdevs_list": [ 00:23:56.309 { 00:23:56.309 "name": null, 00:23:56.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.309 "is_configured": false, 00:23:56.309 "data_offset": 2048, 00:23:56.309 "data_size": 63488 00:23:56.309 }, 00:23:56.309 { 00:23:56.309 "name": "BaseBdev2", 00:23:56.309 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:56.309 "is_configured": true, 00:23:56.309 "data_offset": 2048, 00:23:56.309 "data_size": 63488 00:23:56.309 } 00:23:56.309 ] 00:23:56.309 }' 00:23:56.309 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.309 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:56.309 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.309 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.309 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:56.567 [2024-07-15 10:31:33.606105] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:56.567 [2024-07-15 10:31:33.611828] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c10ce0 00:23:56.567 [2024-07-15 10:31:33.613351] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:56.567 10:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:57.504 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:57.504 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:57.504 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:57.504 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:57.504 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:57.504 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.504 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.763 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:57.763 "name": "raid_bdev1", 00:23:57.763 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:57.763 "strip_size_kb": 0, 00:23:57.763 "state": "online", 00:23:57.763 "raid_level": "raid1", 00:23:57.763 "superblock": true, 00:23:57.763 "num_base_bdevs": 2, 00:23:57.763 "num_base_bdevs_discovered": 2, 00:23:57.763 "num_base_bdevs_operational": 2, 00:23:57.763 "process": { 00:23:57.763 "type": "rebuild", 00:23:57.763 "target": "spare", 00:23:57.763 "progress": { 00:23:57.763 "blocks": 24576, 00:23:57.763 "percent": 38 00:23:57.763 } 00:23:57.763 }, 00:23:57.763 
"base_bdevs_list": [ 00:23:57.763 { 00:23:57.763 "name": "spare", 00:23:57.764 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:23:57.764 "is_configured": true, 00:23:57.764 "data_offset": 2048, 00:23:57.764 "data_size": 63488 00:23:57.764 }, 00:23:57.764 { 00:23:57.764 "name": "BaseBdev2", 00:23:57.764 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:57.764 "is_configured": true, 00:23:57.764 "data_offset": 2048, 00:23:57.764 "data_size": 63488 00:23:57.764 } 00:23:57.764 ] 00:23:57.764 }' 00:23:57.764 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:57.764 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:57.764 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:58.023 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=779 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:58.023 10:31:34 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.023 10:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.282 10:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:58.282 "name": "raid_bdev1", 00:23:58.282 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:58.282 "strip_size_kb": 0, 00:23:58.282 "state": "online", 00:23:58.282 "raid_level": "raid1", 00:23:58.282 "superblock": true, 00:23:58.282 "num_base_bdevs": 2, 00:23:58.282 "num_base_bdevs_discovered": 2, 00:23:58.282 "num_base_bdevs_operational": 2, 00:23:58.282 "process": { 00:23:58.282 "type": "rebuild", 00:23:58.282 "target": "spare", 00:23:58.282 "progress": { 00:23:58.282 "blocks": 30720, 00:23:58.282 "percent": 48 00:23:58.282 } 00:23:58.282 }, 00:23:58.282 "base_bdevs_list": [ 00:23:58.282 { 00:23:58.282 "name": "spare", 00:23:58.282 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:23:58.282 "is_configured": true, 00:23:58.282 "data_offset": 2048, 00:23:58.282 "data_size": 63488 00:23:58.282 }, 00:23:58.282 { 00:23:58.282 "name": "BaseBdev2", 00:23:58.282 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:58.282 "is_configured": true, 00:23:58.282 "data_offset": 2048, 00:23:58.282 "data_size": 63488 00:23:58.282 } 00:23:58.282 ] 00:23:58.282 }' 00:23:58.282 10:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:23:58.282 10:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:58.282 10:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:58.282 10:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:58.282 10:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.220 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.479 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:59.479 "name": "raid_bdev1", 00:23:59.479 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:23:59.479 "strip_size_kb": 0, 00:23:59.479 "state": "online", 00:23:59.479 "raid_level": "raid1", 00:23:59.479 "superblock": true, 00:23:59.479 "num_base_bdevs": 2, 00:23:59.479 "num_base_bdevs_discovered": 2, 00:23:59.479 "num_base_bdevs_operational": 2, 00:23:59.479 "process": { 00:23:59.479 "type": "rebuild", 00:23:59.479 "target": "spare", 
00:23:59.479 "progress": { 00:23:59.479 "blocks": 57344, 00:23:59.479 "percent": 90 00:23:59.479 } 00:23:59.479 }, 00:23:59.479 "base_bdevs_list": [ 00:23:59.479 { 00:23:59.479 "name": "spare", 00:23:59.479 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:23:59.479 "is_configured": true, 00:23:59.479 "data_offset": 2048, 00:23:59.479 "data_size": 63488 00:23:59.479 }, 00:23:59.479 { 00:23:59.479 "name": "BaseBdev2", 00:23:59.479 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:23:59.479 "is_configured": true, 00:23:59.479 "data_offset": 2048, 00:23:59.479 "data_size": 63488 00:23:59.479 } 00:23:59.479 ] 00:23:59.479 }' 00:23:59.479 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:59.479 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:59.479 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:59.479 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:59.479 10:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:59.738 [2024-07-15 10:31:36.737988] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:59.738 [2024-07-15 10:31:36.738053] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:59.738 [2024-07-15 10:31:36.738140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:00.674 "name": "raid_bdev1", 00:24:00.674 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:00.674 "strip_size_kb": 0, 00:24:00.674 "state": "online", 00:24:00.674 "raid_level": "raid1", 00:24:00.674 "superblock": true, 00:24:00.674 "num_base_bdevs": 2, 00:24:00.674 "num_base_bdevs_discovered": 2, 00:24:00.674 "num_base_bdevs_operational": 2, 00:24:00.674 "base_bdevs_list": [ 00:24:00.674 { 00:24:00.674 "name": "spare", 00:24:00.674 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:24:00.674 "is_configured": true, 00:24:00.674 "data_offset": 2048, 00:24:00.674 "data_size": 63488 00:24:00.674 }, 00:24:00.674 { 00:24:00.674 "name": "BaseBdev2", 00:24:00.674 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:00.674 "is_configured": true, 00:24:00.674 "data_offset": 2048, 00:24:00.674 "data_size": 63488 00:24:00.674 } 00:24:00.674 ] 00:24:00.674 }' 00:24:00.674 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:00.933 
10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.933 10:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:01.193 "name": "raid_bdev1", 00:24:01.193 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:01.193 "strip_size_kb": 0, 00:24:01.193 "state": "online", 00:24:01.193 "raid_level": "raid1", 00:24:01.193 "superblock": true, 00:24:01.193 "num_base_bdevs": 2, 00:24:01.193 "num_base_bdevs_discovered": 2, 00:24:01.193 "num_base_bdevs_operational": 2, 00:24:01.193 "base_bdevs_list": [ 00:24:01.193 { 00:24:01.193 "name": "spare", 00:24:01.193 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:24:01.193 "is_configured": true, 00:24:01.193 "data_offset": 2048, 00:24:01.193 "data_size": 63488 00:24:01.193 }, 00:24:01.193 { 00:24:01.193 "name": "BaseBdev2", 00:24:01.193 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:01.193 "is_configured": true, 00:24:01.193 "data_offset": 2048, 00:24:01.193 "data_size": 63488 00:24:01.193 } 00:24:01.193 ] 00:24:01.193 }' 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.193 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.452 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.452 "name": "raid_bdev1", 00:24:01.452 "uuid": 
"be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:01.452 "strip_size_kb": 0, 00:24:01.452 "state": "online", 00:24:01.452 "raid_level": "raid1", 00:24:01.452 "superblock": true, 00:24:01.452 "num_base_bdevs": 2, 00:24:01.452 "num_base_bdevs_discovered": 2, 00:24:01.452 "num_base_bdevs_operational": 2, 00:24:01.452 "base_bdevs_list": [ 00:24:01.452 { 00:24:01.452 "name": "spare", 00:24:01.452 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:24:01.452 "is_configured": true, 00:24:01.452 "data_offset": 2048, 00:24:01.452 "data_size": 63488 00:24:01.452 }, 00:24:01.452 { 00:24:01.452 "name": "BaseBdev2", 00:24:01.452 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:01.452 "is_configured": true, 00:24:01.452 "data_offset": 2048, 00:24:01.452 "data_size": 63488 00:24:01.452 } 00:24:01.452 ] 00:24:01.452 }' 00:24:01.452 10:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.452 10:31:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:02.065 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:02.324 [2024-07-15 10:31:39.294078] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:02.324 [2024-07-15 10:31:39.294108] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:02.324 [2024-07-15 10:31:39.294172] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:02.324 [2024-07-15 10:31:39.294230] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:02.324 [2024-07-15 10:31:39.294242] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c11070 name raid_bdev1, state offline 00:24:02.324 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.324 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:02.583 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:02.842 /dev/nbd0 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:02.842 1+0 records in 00:24:02.842 1+0 records out 00:24:02.842 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242014 s, 16.9 MB/s 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:02.842 10:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:02.842 10:31:39 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:03.101 /dev/nbd1 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:03.101 1+0 records in 00:24:03.101 1+0 records out 00:24:03.101 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000321187 s, 12.8 MB/s 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:03.101 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:03.358 10:31:40 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:03.358 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:03.617 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:03.876 10:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:04.134 [2024-07-15 10:31:41.224245] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:24:04.134 [2024-07-15 10:31:41.224292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:04.134 [2024-07-15 10:31:41.224313] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c10500 00:24:04.134 [2024-07-15 10:31:41.224325] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:04.135 [2024-07-15 10:31:41.226008] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:04.135 [2024-07-15 10:31:41.226040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:04.135 [2024-07-15 10:31:41.226121] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:04.135 [2024-07-15 10:31:41.226151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:04.135 [2024-07-15 10:31:41.226252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:04.135 spare 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.135 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.135 [2024-07-15 10:31:41.326563] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c0f260 00:24:04.135 [2024-07-15 10:31:41.326578] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:04.135 [2024-07-15 10:31:41.326775] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0a490 00:24:04.135 [2024-07-15 10:31:41.326934] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c0f260 00:24:04.135 [2024-07-15 10:31:41.326945] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c0f260 00:24:04.135 [2024-07-15 10:31:41.327050] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:04.393 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.393 "name": "raid_bdev1", 00:24:04.393 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:04.393 "strip_size_kb": 0, 00:24:04.393 "state": "online", 00:24:04.393 "raid_level": "raid1", 00:24:04.393 "superblock": true, 00:24:04.393 "num_base_bdevs": 2, 00:24:04.393 "num_base_bdevs_discovered": 2, 00:24:04.393 "num_base_bdevs_operational": 2, 00:24:04.393 "base_bdevs_list": [ 00:24:04.393 { 00:24:04.393 "name": "spare", 00:24:04.393 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:24:04.393 "is_configured": true, 00:24:04.393 "data_offset": 2048, 00:24:04.393 "data_size": 63488 00:24:04.393 }, 00:24:04.393 { 00:24:04.393 "name": "BaseBdev2", 00:24:04.393 "uuid": 
"08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:04.393 "is_configured": true, 00:24:04.393 "data_offset": 2048, 00:24:04.393 "data_size": 63488 00:24:04.393 } 00:24:04.393 ] 00:24:04.393 }' 00:24:04.393 10:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:04.393 10:31:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:05.330 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:05.330 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.330 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:05.330 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:05.330 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.330 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.330 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.589 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.589 "name": "raid_bdev1", 00:24:05.589 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:05.589 "strip_size_kb": 0, 00:24:05.589 "state": "online", 00:24:05.589 "raid_level": "raid1", 00:24:05.589 "superblock": true, 00:24:05.589 "num_base_bdevs": 2, 00:24:05.589 "num_base_bdevs_discovered": 2, 00:24:05.589 "num_base_bdevs_operational": 2, 00:24:05.589 "base_bdevs_list": [ 00:24:05.589 { 00:24:05.589 "name": "spare", 00:24:05.589 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:24:05.589 "is_configured": true, 00:24:05.589 "data_offset": 2048, 00:24:05.589 "data_size": 63488 00:24:05.589 }, 00:24:05.589 { 
00:24:05.589 "name": "BaseBdev2", 00:24:05.589 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:05.589 "is_configured": true, 00:24:05.589 "data_offset": 2048, 00:24:05.589 "data_size": 63488 00:24:05.589 } 00:24:05.589 ] 00:24:05.589 }' 00:24:05.589 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.589 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:05.589 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.589 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:05.589 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.589 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:05.848 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:05.848 10:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:06.107 [2024-07-15 10:31:43.145486] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.107 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.108 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.108 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.365 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.365 "name": "raid_bdev1", 00:24:06.366 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:06.366 "strip_size_kb": 0, 00:24:06.366 "state": "online", 00:24:06.366 "raid_level": "raid1", 00:24:06.366 "superblock": true, 00:24:06.366 "num_base_bdevs": 2, 00:24:06.366 "num_base_bdevs_discovered": 1, 00:24:06.366 "num_base_bdevs_operational": 1, 00:24:06.366 "base_bdevs_list": [ 00:24:06.366 { 00:24:06.366 "name": null, 00:24:06.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.366 "is_configured": false, 00:24:06.366 "data_offset": 2048, 00:24:06.366 "data_size": 63488 00:24:06.366 }, 00:24:06.366 { 00:24:06.366 "name": "BaseBdev2", 00:24:06.366 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:06.366 "is_configured": true, 00:24:06.366 "data_offset": 2048, 00:24:06.366 "data_size": 63488 00:24:06.366 } 00:24:06.366 ] 00:24:06.366 }' 00:24:06.366 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.366 10:31:43 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:24:06.932 10:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:06.932 [2024-07-15 10:31:44.128093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:06.932 [2024-07-15 10:31:44.128248] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:06.932 [2024-07-15 10:31:44.128265] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:06.932 [2024-07-15 10:31:44.128293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:07.191 [2024-07-15 10:31:44.133105] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0a490 00:24:07.191 [2024-07-15 10:31:44.135421] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:07.191 10:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:08.126 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:08.126 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:08.126 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:08.126 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:08.126 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:08.126 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.126 10:31:45 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.385 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:08.385 "name": "raid_bdev1", 00:24:08.385 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:08.385 "strip_size_kb": 0, 00:24:08.385 "state": "online", 00:24:08.385 "raid_level": "raid1", 00:24:08.385 "superblock": true, 00:24:08.385 "num_base_bdevs": 2, 00:24:08.385 "num_base_bdevs_discovered": 2, 00:24:08.385 "num_base_bdevs_operational": 2, 00:24:08.385 "process": { 00:24:08.385 "type": "rebuild", 00:24:08.385 "target": "spare", 00:24:08.385 "progress": { 00:24:08.385 "blocks": 24576, 00:24:08.385 "percent": 38 00:24:08.385 } 00:24:08.385 }, 00:24:08.385 "base_bdevs_list": [ 00:24:08.385 { 00:24:08.385 "name": "spare", 00:24:08.385 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:24:08.385 "is_configured": true, 00:24:08.385 "data_offset": 2048, 00:24:08.385 "data_size": 63488 00:24:08.385 }, 00:24:08.385 { 00:24:08.385 "name": "BaseBdev2", 00:24:08.385 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:08.385 "is_configured": true, 00:24:08.385 "data_offset": 2048, 00:24:08.385 "data_size": 63488 00:24:08.385 } 00:24:08.385 ] 00:24:08.385 }' 00:24:08.385 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:08.385 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:08.385 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:08.385 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:08.385 10:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:08.953 [2024-07-15 10:31:45.970696] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:24:08.953 [2024-07-15 10:31:46.050144] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:08.953 [2024-07-15 10:31:46.050188] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:08.953 [2024-07-15 10:31:46.050204] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:08.953 [2024-07-15 10:31:46.050213] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.953 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.212 10:31:46 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:09.212 "name": "raid_bdev1", 00:24:09.212 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:09.212 "strip_size_kb": 0, 00:24:09.212 "state": "online", 00:24:09.212 "raid_level": "raid1", 00:24:09.212 "superblock": true, 00:24:09.212 "num_base_bdevs": 2, 00:24:09.212 "num_base_bdevs_discovered": 1, 00:24:09.212 "num_base_bdevs_operational": 1, 00:24:09.212 "base_bdevs_list": [ 00:24:09.212 { 00:24:09.212 "name": null, 00:24:09.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.212 "is_configured": false, 00:24:09.212 "data_offset": 2048, 00:24:09.212 "data_size": 63488 00:24:09.212 }, 00:24:09.212 { 00:24:09.212 "name": "BaseBdev2", 00:24:09.212 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:09.212 "is_configured": true, 00:24:09.212 "data_offset": 2048, 00:24:09.212 "data_size": 63488 00:24:09.212 } 00:24:09.212 ] 00:24:09.212 }' 00:24:09.212 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:09.212 10:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:09.778 10:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:10.036 [2024-07-15 10:31:47.141389] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:10.036 [2024-07-15 10:31:47.141442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:10.036 [2024-07-15 10:31:47.141464] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c10730 00:24:10.036 [2024-07-15 10:31:47.141477] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:10.036 [2024-07-15 10:31:47.141858] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:10.036 [2024-07-15 
10:31:47.141877] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:10.036 [2024-07-15 10:31:47.141967] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:10.036 [2024-07-15 10:31:47.141979] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:10.036 [2024-07-15 10:31:47.141990] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:10.036 [2024-07-15 10:31:47.142009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:10.036 [2024-07-15 10:31:47.146905] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c133d0 00:24:10.036 spare 00:24:10.036 [2024-07-15 10:31:47.148373] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:10.036 10:31:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:10.971 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.972 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.972 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.972 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.972 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.230 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.230 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.230 10:31:48 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.230 "name": "raid_bdev1", 00:24:11.230 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:11.230 "strip_size_kb": 0, 00:24:11.230 "state": "online", 00:24:11.230 "raid_level": "raid1", 00:24:11.230 "superblock": true, 00:24:11.230 "num_base_bdevs": 2, 00:24:11.230 "num_base_bdevs_discovered": 2, 00:24:11.230 "num_base_bdevs_operational": 2, 00:24:11.230 "process": { 00:24:11.230 "type": "rebuild", 00:24:11.230 "target": "spare", 00:24:11.230 "progress": { 00:24:11.230 "blocks": 24576, 00:24:11.230 "percent": 38 00:24:11.230 } 00:24:11.230 }, 00:24:11.230 "base_bdevs_list": [ 00:24:11.230 { 00:24:11.230 "name": "spare", 00:24:11.230 "uuid": "2ce4ac34-b591-5690-b923-1d2ac5578209", 00:24:11.230 "is_configured": true, 00:24:11.230 "data_offset": 2048, 00:24:11.230 "data_size": 63488 00:24:11.230 }, 00:24:11.230 { 00:24:11.230 "name": "BaseBdev2", 00:24:11.230 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:11.230 "is_configured": true, 00:24:11.230 "data_offset": 2048, 00:24:11.230 "data_size": 63488 00:24:11.230 } 00:24:11.230 ] 00:24:11.230 }' 00:24:11.230 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.489 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:11.489 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.489 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:11.489 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:11.489 [2024-07-15 10:31:48.683078] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.748 [2024-07-15 10:31:48.760705] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:24:11.748 [2024-07-15 10:31:48.760763] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.748 [2024-07-15 10:31:48.760779] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:11.748 [2024-07-15 10:31:48.760787] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.748 10:31:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.006 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.006 "name": "raid_bdev1", 00:24:12.006 "uuid": 
"be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:12.006 "strip_size_kb": 0, 00:24:12.006 "state": "online", 00:24:12.006 "raid_level": "raid1", 00:24:12.006 "superblock": true, 00:24:12.006 "num_base_bdevs": 2, 00:24:12.006 "num_base_bdevs_discovered": 1, 00:24:12.006 "num_base_bdevs_operational": 1, 00:24:12.006 "base_bdevs_list": [ 00:24:12.006 { 00:24:12.006 "name": null, 00:24:12.006 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.006 "is_configured": false, 00:24:12.006 "data_offset": 2048, 00:24:12.006 "data_size": 63488 00:24:12.006 }, 00:24:12.006 { 00:24:12.006 "name": "BaseBdev2", 00:24:12.006 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:12.006 "is_configured": true, 00:24:12.006 "data_offset": 2048, 00:24:12.006 "data_size": 63488 00:24:12.006 } 00:24:12.006 ] 00:24:12.006 }' 00:24:12.006 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.006 10:31:49 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:12.574 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:12.574 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:12.574 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:12.574 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:12.574 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:12.574 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.574 10:31:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.142 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:24:13.142 "name": "raid_bdev1", 00:24:13.142 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:13.142 "strip_size_kb": 0, 00:24:13.142 "state": "online", 00:24:13.142 "raid_level": "raid1", 00:24:13.142 "superblock": true, 00:24:13.142 "num_base_bdevs": 2, 00:24:13.142 "num_base_bdevs_discovered": 1, 00:24:13.142 "num_base_bdevs_operational": 1, 00:24:13.142 "base_bdevs_list": [ 00:24:13.142 { 00:24:13.142 "name": null, 00:24:13.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:13.142 "is_configured": false, 00:24:13.142 "data_offset": 2048, 00:24:13.142 "data_size": 63488 00:24:13.142 }, 00:24:13.142 { 00:24:13.142 "name": "BaseBdev2", 00:24:13.142 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:13.142 "is_configured": true, 00:24:13.142 "data_offset": 2048, 00:24:13.142 "data_size": 63488 00:24:13.142 } 00:24:13.142 ] 00:24:13.142 }' 00:24:13.142 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:13.142 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:13.142 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:13.142 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:13.142 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:13.401 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:13.401 [2024-07-15 10:31:50.590710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:13.401 [2024-07-15 10:31:50.590768] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:13.401 
[2024-07-15 10:31:50.590790] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0ada0 00:24:13.401 [2024-07-15 10:31:50.590803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:13.401 [2024-07-15 10:31:50.591187] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:13.401 [2024-07-15 10:31:50.591206] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:13.401 [2024-07-15 10:31:50.591277] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:13.401 [2024-07-15 10:31:50.591291] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:13.401 [2024-07-15 10:31:50.591301] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:13.401 BaseBdev1 00:24:13.659 10:31:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.594 10:31:51 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.162 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:15.162 "name": "raid_bdev1", 00:24:15.162 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:15.162 "strip_size_kb": 0, 00:24:15.162 "state": "online", 00:24:15.162 "raid_level": "raid1", 00:24:15.162 "superblock": true, 00:24:15.162 "num_base_bdevs": 2, 00:24:15.162 "num_base_bdevs_discovered": 1, 00:24:15.162 "num_base_bdevs_operational": 1, 00:24:15.162 "base_bdevs_list": [ 00:24:15.162 { 00:24:15.162 "name": null, 00:24:15.162 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.162 "is_configured": false, 00:24:15.162 "data_offset": 2048, 00:24:15.162 "data_size": 63488 00:24:15.162 }, 00:24:15.162 { 00:24:15.162 "name": "BaseBdev2", 00:24:15.162 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:15.162 "is_configured": true, 00:24:15.162 "data_offset": 2048, 00:24:15.162 "data_size": 63488 00:24:15.162 } 00:24:15.162 ] 00:24:15.162 }' 00:24:15.162 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:15.162 10:31:52 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:15.730 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:15.730 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:15.730 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:24:15.730 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:15.730 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:15.730 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.730 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.988 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:15.988 "name": "raid_bdev1", 00:24:15.988 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:15.988 "strip_size_kb": 0, 00:24:15.988 "state": "online", 00:24:15.988 "raid_level": "raid1", 00:24:15.988 "superblock": true, 00:24:15.988 "num_base_bdevs": 2, 00:24:15.988 "num_base_bdevs_discovered": 1, 00:24:15.988 "num_base_bdevs_operational": 1, 00:24:15.988 "base_bdevs_list": [ 00:24:15.988 { 00:24:15.988 "name": null, 00:24:15.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:15.988 "is_configured": false, 00:24:15.988 "data_offset": 2048, 00:24:15.988 "data_size": 63488 00:24:15.988 }, 00:24:15.988 { 00:24:15.988 "name": "BaseBdev2", 00:24:15.988 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:15.988 "is_configured": true, 00:24:15.988 "data_offset": 2048, 00:24:15.988 "data_size": 63488 00:24:15.988 } 00:24:15.988 ] 00:24:15.988 }' 00:24:15.988 10:31:52 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:15.988 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:16.245 [2024-07-15 10:31:53.277842] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:24:16.245 [2024-07-15 10:31:53.277978] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:16.245 [2024-07-15 10:31:53.277995] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:16.245 request: 00:24:16.245 { 00:24:16.245 "base_bdev": "BaseBdev1", 00:24:16.245 "raid_bdev": "raid_bdev1", 00:24:16.245 "method": "bdev_raid_add_base_bdev", 00:24:16.245 "req_id": 1 00:24:16.245 } 00:24:16.245 Got JSON-RPC error response 00:24:16.245 response: 00:24:16.245 { 00:24:16.245 "code": -22, 00:24:16.245 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:16.245 } 00:24:16.245 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:24:16.245 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:16.245 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:16.245 10:31:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:16.245 10:31:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.246 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.506 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.506 "name": "raid_bdev1", 00:24:17.506 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:17.506 "strip_size_kb": 0, 00:24:17.506 "state": "online", 00:24:17.506 "raid_level": "raid1", 00:24:17.506 "superblock": true, 00:24:17.506 "num_base_bdevs": 2, 00:24:17.506 "num_base_bdevs_discovered": 1, 00:24:17.506 "num_base_bdevs_operational": 1, 00:24:17.506 "base_bdevs_list": [ 00:24:17.506 { 00:24:17.506 "name": null, 00:24:17.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.506 "is_configured": false, 00:24:17.506 "data_offset": 2048, 00:24:17.506 "data_size": 63488 00:24:17.506 }, 00:24:17.506 { 00:24:17.506 "name": "BaseBdev2", 00:24:17.506 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:17.506 "is_configured": true, 00:24:17.506 "data_offset": 2048, 00:24:17.506 "data_size": 63488 00:24:17.506 } 00:24:17.506 ] 00:24:17.506 }' 00:24:17.506 10:31:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.506 10:31:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:18.073 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:18.073 
10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:18.073 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:18.073 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:18.073 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:18.073 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.073 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:18.332 "name": "raid_bdev1", 00:24:18.332 "uuid": "be4504b2-1a4c-490e-85bd-f628db0dd758", 00:24:18.332 "strip_size_kb": 0, 00:24:18.332 "state": "online", 00:24:18.332 "raid_level": "raid1", 00:24:18.332 "superblock": true, 00:24:18.332 "num_base_bdevs": 2, 00:24:18.332 "num_base_bdevs_discovered": 1, 00:24:18.332 "num_base_bdevs_operational": 1, 00:24:18.332 "base_bdevs_list": [ 00:24:18.332 { 00:24:18.332 "name": null, 00:24:18.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:18.332 "is_configured": false, 00:24:18.332 "data_offset": 2048, 00:24:18.332 "data_size": 63488 00:24:18.332 }, 00:24:18.332 { 00:24:18.332 "name": "BaseBdev2", 00:24:18.332 "uuid": "08358a2e-6d04-5c66-96f4-8f32bfb83e49", 00:24:18.332 "is_configured": true, 00:24:18.332 "data_offset": 2048, 00:24:18.332 "data_size": 63488 00:24:18.332 } 00:24:18.332 ] 00:24:18.332 }' 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 583479 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 583479 ']' 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 583479 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 583479 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 583479' 00:24:18.332 killing process with pid 583479 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 583479 00:24:18.332 Received shutdown signal, test time was about 60.000000 seconds 00:24:18.332 00:24:18.332 Latency(us) 00:24:18.332 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:18.332 =================================================================================================================== 00:24:18.332 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:18.332 [2024-07-15 10:31:55.480852] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:18.332 [2024-07-15 10:31:55.480956] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:18.332 [2024-07-15 10:31:55.481000] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:18.332 [2024-07-15 10:31:55.481012] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c0f260 name raid_bdev1, state offline 00:24:18.332 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 583479 00:24:18.332 [2024-07-15 10:31:55.507267] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:18.591 10:31:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:18.591 00:24:18.591 real 0m35.595s 00:24:18.591 user 0m52.814s 00:24:18.591 sys 0m6.496s 00:24:18.591 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:18.591 10:31:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:18.591 ************************************ 00:24:18.591 END TEST raid_rebuild_test_sb 00:24:18.591 ************************************ 00:24:18.591 10:31:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:18.591 10:31:55 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:24:18.591 10:31:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:18.591 10:31:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:18.591 10:31:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:18.849 ************************************ 00:24:18.849 START TEST raid_rebuild_test_io 00:24:18.849 ************************************ 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:18.849 10:31:55 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=588523 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 588523 /var/tmp/spdk-raid.sock 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 588523 ']' 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:18.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:18.849 10:31:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:18.849 [2024-07-15 10:31:55.867331] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:24:18.850 [2024-07-15 10:31:55.867403] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid588523 ] 00:24:18.850 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:24:18.850 Zero copy mechanism will not be used. 00:24:18.850 [2024-07-15 10:31:55.996142] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:19.108 [2024-07-15 10:31:56.100228] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:19.108 [2024-07-15 10:31:56.166645] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:19.108 [2024-07-15 10:31:56.166682] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:19.674 10:31:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:19.674 10:31:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:24:19.674 10:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:19.674 10:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:19.932 BaseBdev1_malloc 00:24:19.932 10:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:20.189 [2024-07-15 10:31:57.270512] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:20.189 [2024-07-15 10:31:57.270562] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.189 [2024-07-15 10:31:57.270588] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10b0d40 00:24:20.189 [2024-07-15 10:31:57.270601] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.189 [2024-07-15 10:31:57.272367] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.189 [2024-07-15 10:31:57.272397] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:24:20.189 BaseBdev1 00:24:20.189 10:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:20.189 10:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:20.447 BaseBdev2_malloc 00:24:20.447 10:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:20.705 [2024-07-15 10:31:57.761992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:20.705 [2024-07-15 10:31:57.762041] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.705 [2024-07-15 10:31:57.762066] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10b1860 00:24:20.705 [2024-07-15 10:31:57.762079] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.705 [2024-07-15 10:31:57.763600] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.705 [2024-07-15 10:31:57.763628] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:20.705 BaseBdev2 00:24:20.705 10:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:20.963 spare_malloc 00:24:20.963 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:21.221 spare_delay 00:24:21.221 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:21.479 [2024-07-15 10:31:58.520606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:21.479 [2024-07-15 10:31:58.520659] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.479 [2024-07-15 10:31:58.520682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125fec0 00:24:21.479 [2024-07-15 10:31:58.520694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.479 [2024-07-15 10:31:58.522316] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.479 [2024-07-15 10:31:58.522345] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:21.479 spare 00:24:21.479 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:21.738 [2024-07-15 10:31:58.765268] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:21.738 [2024-07-15 10:31:58.766634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:21.738 [2024-07-15 10:31:58.766713] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1261070 00:24:21.738 [2024-07-15 10:31:58.766724] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:21.738 [2024-07-15 10:31:58.766944] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x125a490 00:24:21.738 [2024-07-15 10:31:58.767088] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1261070 00:24:21.738 [2024-07-15 10:31:58.767099] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1261070 00:24:21.738 [2024-07-15 10:31:58.767216] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.738 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:21.738 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:21.738 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:21.738 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.739 10:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.997 10:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:21.997 "name": "raid_bdev1", 00:24:21.997 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:21.997 "strip_size_kb": 0, 00:24:21.997 "state": "online", 00:24:21.997 "raid_level": "raid1", 00:24:21.997 "superblock": false, 00:24:21.997 "num_base_bdevs": 2, 00:24:21.997 "num_base_bdevs_discovered": 2, 00:24:21.997 
"num_base_bdevs_operational": 2, 00:24:21.997 "base_bdevs_list": [ 00:24:21.997 { 00:24:21.997 "name": "BaseBdev1", 00:24:21.997 "uuid": "3eb65551-fe43-5766-8f13-2e73e425c3fb", 00:24:21.997 "is_configured": true, 00:24:21.997 "data_offset": 0, 00:24:21.997 "data_size": 65536 00:24:21.997 }, 00:24:21.997 { 00:24:21.997 "name": "BaseBdev2", 00:24:21.997 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:21.997 "is_configured": true, 00:24:21.997 "data_offset": 0, 00:24:21.997 "data_size": 65536 00:24:21.997 } 00:24:21.997 ] 00:24:21.997 }' 00:24:21.998 10:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:21.998 10:31:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:22.565 10:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:22.565 10:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:22.824 [2024-07-15 10:31:59.876451] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:22.824 10:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:22.824 10:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.824 10:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:23.082 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:23.082 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:23.082 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock 
perform_tests 00:24:23.082 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:23.082 [2024-07-15 10:32:00.247550] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x125bbd0 00:24:23.082 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:23.082 Zero copy mechanism will not be used. 00:24:23.082 Running I/O for 60 seconds... 00:24:23.649 [2024-07-15 10:32:00.627359] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:23.649 [2024-07-15 10:32:00.643493] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x125bbd0 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.649 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.907 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.907 "name": "raid_bdev1", 00:24:23.907 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:23.907 "strip_size_kb": 0, 00:24:23.907 "state": "online", 00:24:23.907 "raid_level": "raid1", 00:24:23.907 "superblock": false, 00:24:23.907 "num_base_bdevs": 2, 00:24:23.907 "num_base_bdevs_discovered": 1, 00:24:23.907 "num_base_bdevs_operational": 1, 00:24:23.907 "base_bdevs_list": [ 00:24:23.907 { 00:24:23.907 "name": null, 00:24:23.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.908 "is_configured": false, 00:24:23.908 "data_offset": 0, 00:24:23.908 "data_size": 65536 00:24:23.908 }, 00:24:23.908 { 00:24:23.908 "name": "BaseBdev2", 00:24:23.908 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:23.908 "is_configured": true, 00:24:23.908 "data_offset": 0, 00:24:23.908 "data_size": 65536 00:24:23.908 } 00:24:23.908 ] 00:24:23.908 }' 00:24:23.908 10:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.908 10:32:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:24.474 10:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:24.732 [2024-07-15 10:32:01.772360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:24.732 10:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:24.732 [2024-07-15 10:32:01.847531] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e38b0 00:24:24.732 [2024-07-15 10:32:01.849921] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:24.990 [2024-07-15 10:32:01.968880] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:24.990 [2024-07-15 10:32:01.969321] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:25.248 [2024-07-15 10:32:02.189018] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:25.248 [2024-07-15 10:32:02.189237] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:25.505 [2024-07-15 10:32:02.529956] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:25.505 [2024-07-15 10:32:02.530196] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:25.505 [2024-07-15 10:32:02.647601] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:25.505 [2024-07-15 10:32:02.647719] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:25.762 10:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:25.762 10:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:25.762 10:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:25.762 10:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:25.762 10:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:25.762 10:32:02 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.762 10:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.020 [2024-07-15 10:32:03.004549] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:26.020 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:26.020 "name": "raid_bdev1", 00:24:26.020 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:26.020 "strip_size_kb": 0, 00:24:26.020 "state": "online", 00:24:26.020 "raid_level": "raid1", 00:24:26.020 "superblock": false, 00:24:26.020 "num_base_bdevs": 2, 00:24:26.020 "num_base_bdevs_discovered": 2, 00:24:26.020 "num_base_bdevs_operational": 2, 00:24:26.020 "process": { 00:24:26.020 "type": "rebuild", 00:24:26.020 "target": "spare", 00:24:26.020 "progress": { 00:24:26.020 "blocks": 14336, 00:24:26.020 "percent": 21 00:24:26.020 } 00:24:26.020 }, 00:24:26.020 "base_bdevs_list": [ 00:24:26.020 { 00:24:26.020 "name": "spare", 00:24:26.020 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed", 00:24:26.020 "is_configured": true, 00:24:26.020 "data_offset": 0, 00:24:26.020 "data_size": 65536 00:24:26.020 }, 00:24:26.020 { 00:24:26.020 "name": "BaseBdev2", 00:24:26.020 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:26.020 "is_configured": true, 00:24:26.020 "data_offset": 0, 00:24:26.020 "data_size": 65536 00:24:26.020 } 00:24:26.020 ] 00:24:26.020 }' 00:24:26.020 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:26.020 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:26.020 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:26.020 10:32:03 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:26.020 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:26.020 [2024-07-15 10:32:03.217082] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:26.277 [2024-07-15 10:32:03.416772] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:26.277 [2024-07-15 10:32:03.448532] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:26.277 [2024-07-15 10:32:03.457704] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:26.277 [2024-07-15 10:32:03.467519] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:26.277 [2024-07-15 10:32:03.467548] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:26.277 [2024-07-15 10:32:03.467559] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:26.277 [2024-07-15 10:32:03.474204] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x125bbd0 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.534 10:32:03 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.534 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.792 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.792 "name": "raid_bdev1", 00:24:26.792 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:26.792 "strip_size_kb": 0, 00:24:26.792 "state": "online", 00:24:26.792 "raid_level": "raid1", 00:24:26.792 "superblock": false, 00:24:26.792 "num_base_bdevs": 2, 00:24:26.792 "num_base_bdevs_discovered": 1, 00:24:26.792 "num_base_bdevs_operational": 1, 00:24:26.792 "base_bdevs_list": [ 00:24:26.792 { 00:24:26.792 "name": null, 00:24:26.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.792 "is_configured": false, 00:24:26.792 "data_offset": 0, 00:24:26.792 "data_size": 65536 00:24:26.792 }, 00:24:26.792 { 00:24:26.792 "name": "BaseBdev2", 00:24:26.792 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:26.792 "is_configured": true, 00:24:26.792 "data_offset": 0, 00:24:26.792 "data_size": 65536 00:24:26.792 } 00:24:26.792 ] 00:24:26.792 }' 00:24:26.792 10:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.792 10:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 
00:24:27.358 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.358 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.358 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.358 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.358 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.358 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.358 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.616 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.616 "name": "raid_bdev1", 00:24:27.616 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:27.616 "strip_size_kb": 0, 00:24:27.616 "state": "online", 00:24:27.616 "raid_level": "raid1", 00:24:27.616 "superblock": false, 00:24:27.616 "num_base_bdevs": 2, 00:24:27.616 "num_base_bdevs_discovered": 1, 00:24:27.616 "num_base_bdevs_operational": 1, 00:24:27.616 "base_bdevs_list": [ 00:24:27.616 { 00:24:27.616 "name": null, 00:24:27.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.616 "is_configured": false, 00:24:27.616 "data_offset": 0, 00:24:27.617 "data_size": 65536 00:24:27.617 }, 00:24:27.617 { 00:24:27.617 "name": "BaseBdev2", 00:24:27.617 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:27.617 "is_configured": true, 00:24:27.617 "data_offset": 0, 00:24:27.617 "data_size": 65536 00:24:27.617 } 00:24:27.617 ] 00:24:27.617 }' 00:24:27.617 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.617 10:32:04 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:27.617 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.617 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:27.617 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:27.875 [2024-07-15 10:32:04.917702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:27.875 10:32:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:27.875 [2024-07-15 10:32:04.993424] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1261450 00:24:27.875 [2024-07-15 10:32:04.994906] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:28.133 [2024-07-15 10:32:05.113794] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:28.133 [2024-07-15 10:32:05.114189] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:28.133 [2024-07-15 10:32:05.233566] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:28.133 [2024-07-15 10:32:05.233859] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:28.391 [2024-07-15 10:32:05.487897] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:28.391 [2024-07-15 10:32:05.488437] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:28.649 [2024-07-15 10:32:05.698168] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:28.649 [2024-07-15 10:32:05.698296] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:28.907 10:32:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:28.907 10:32:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:28.907 10:32:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:28.907 10:32:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:28.907 10:32:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:28.907 10:32:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.907 10:32:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.165 [2024-07-15 10:32:06.145639] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:29.165 [2024-07-15 10:32:06.145822] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.165 "name": "raid_bdev1", 00:24:29.165 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:29.165 "strip_size_kb": 0, 00:24:29.165 "state": "online", 00:24:29.165 "raid_level": "raid1", 00:24:29.165 "superblock": false, 00:24:29.165 "num_base_bdevs": 2, 00:24:29.165 "num_base_bdevs_discovered": 2, 00:24:29.165 "num_base_bdevs_operational": 2, 00:24:29.165 "process": { 00:24:29.165 
"type": "rebuild", 00:24:29.165 "target": "spare", 00:24:29.165 "progress": { 00:24:29.165 "blocks": 14336, 00:24:29.165 "percent": 21 00:24:29.165 } 00:24:29.165 }, 00:24:29.165 "base_bdevs_list": [ 00:24:29.165 { 00:24:29.165 "name": "spare", 00:24:29.165 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed", 00:24:29.165 "is_configured": true, 00:24:29.165 "data_offset": 0, 00:24:29.165 "data_size": 65536 00:24:29.165 }, 00:24:29.165 { 00:24:29.165 "name": "BaseBdev2", 00:24:29.165 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:29.165 "is_configured": true, 00:24:29.165 "data_offset": 0, 00:24:29.165 "data_size": 65536 00:24:29.165 } 00:24:29.165 ] 00:24:29.165 }' 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=811 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.165 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.423 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.423 "name": "raid_bdev1", 00:24:29.423 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:29.423 "strip_size_kb": 0, 00:24:29.423 "state": "online", 00:24:29.423 "raid_level": "raid1", 00:24:29.423 "superblock": false, 00:24:29.423 "num_base_bdevs": 2, 00:24:29.423 "num_base_bdevs_discovered": 2, 00:24:29.423 "num_base_bdevs_operational": 2, 00:24:29.423 "process": { 00:24:29.423 "type": "rebuild", 00:24:29.423 "target": "spare", 00:24:29.423 "progress": { 00:24:29.423 "blocks": 18432, 00:24:29.423 "percent": 28 00:24:29.423 } 00:24:29.423 }, 00:24:29.423 "base_bdevs_list": [ 00:24:29.423 { 00:24:29.423 "name": "spare", 00:24:29.423 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed", 00:24:29.423 "is_configured": true, 00:24:29.423 "data_offset": 0, 00:24:29.423 "data_size": 65536 00:24:29.423 }, 00:24:29.423 { 00:24:29.423 "name": "BaseBdev2", 00:24:29.423 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:29.423 "is_configured": true, 00:24:29.423 "data_offset": 0, 00:24:29.423 "data_size": 65536 00:24:29.423 } 00:24:29.423 ] 00:24:29.423 }' 00:24:29.423 [2024-07-15 10:32:06.511225] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 
24576 00:24:29.423 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.423 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:29.423 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.423 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:29.423 10:32:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:29.682 [2024-07-15 10:32:06.739458] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:29.682 [2024-07-15 10:32:06.739653] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:29.939 [2024-07-15 10:32:07.120657] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:30.196 [2024-07-15 10:32:07.339156] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.456 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.746 [2024-07-15 10:32:07.789097] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:30.746 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:30.746 "name": "raid_bdev1", 00:24:30.746 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:30.746 "strip_size_kb": 0, 00:24:30.746 "state": "online", 00:24:30.746 "raid_level": "raid1", 00:24:30.746 "superblock": false, 00:24:30.746 "num_base_bdevs": 2, 00:24:30.746 "num_base_bdevs_discovered": 2, 00:24:30.746 "num_base_bdevs_operational": 2, 00:24:30.746 "process": { 00:24:30.746 "type": "rebuild", 00:24:30.746 "target": "spare", 00:24:30.746 "progress": { 00:24:30.746 "blocks": 34816, 00:24:30.746 "percent": 53 00:24:30.746 } 00:24:30.746 }, 00:24:30.746 "base_bdevs_list": [ 00:24:30.746 { 00:24:30.746 "name": "spare", 00:24:30.746 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed", 00:24:30.746 "is_configured": true, 00:24:30.746 "data_offset": 0, 00:24:30.746 "data_size": 65536 00:24:30.746 }, 00:24:30.746 { 00:24:30.746 "name": "BaseBdev2", 00:24:30.746 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:30.746 "is_configured": true, 00:24:30.746 "data_offset": 0, 00:24:30.746 "data_size": 65536 00:24:30.746 } 00:24:30.746 ] 00:24:30.746 }' 00:24:30.746 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:30.746 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:30.746 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:31.005 10:32:07 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:31.005 10:32:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:31.262 [2024-07-15 10:32:08.460376] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.828 10:32:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.086 10:32:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:32.086 "name": "raid_bdev1", 00:24:32.086 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:32.086 "strip_size_kb": 0, 00:24:32.086 "state": "online", 00:24:32.086 "raid_level": "raid1", 00:24:32.086 "superblock": false, 00:24:32.086 "num_base_bdevs": 2, 00:24:32.086 "num_base_bdevs_discovered": 2, 00:24:32.086 "num_base_bdevs_operational": 2, 00:24:32.086 "process": { 00:24:32.086 "type": "rebuild", 00:24:32.086 "target": "spare", 00:24:32.086 "progress": { 00:24:32.086 "blocks": 55296, 00:24:32.086 "percent": 84 00:24:32.086 } 00:24:32.086 }, 00:24:32.086 "base_bdevs_list": [ 
00:24:32.086 { 00:24:32.086 "name": "spare", 00:24:32.086 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed", 00:24:32.086 "is_configured": true, 00:24:32.086 "data_offset": 0, 00:24:32.086 "data_size": 65536 00:24:32.086 }, 00:24:32.086 { 00:24:32.086 "name": "BaseBdev2", 00:24:32.086 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:32.086 "is_configured": true, 00:24:32.086 "data_offset": 0, 00:24:32.086 "data_size": 65536 00:24:32.086 } 00:24:32.086 ] 00:24:32.086 }' 00:24:32.086 10:32:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:32.086 10:32:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:32.086 10:32:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:32.345 10:32:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:32.345 10:32:09 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:32.604 [2024-07-15 10:32:09.684846] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:32.604 [2024-07-15 10:32:09.785109] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:32.604 [2024-07-15 10:32:09.786703] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.172 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:33.172 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.172 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.172 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:33.172 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:33.172 10:32:10 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.172 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.172 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.431 "name": "raid_bdev1", 00:24:33.431 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:33.431 "strip_size_kb": 0, 00:24:33.431 "state": "online", 00:24:33.431 "raid_level": "raid1", 00:24:33.431 "superblock": false, 00:24:33.431 "num_base_bdevs": 2, 00:24:33.431 "num_base_bdevs_discovered": 2, 00:24:33.431 "num_base_bdevs_operational": 2, 00:24:33.431 "base_bdevs_list": [ 00:24:33.431 { 00:24:33.431 "name": "spare", 00:24:33.431 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed", 00:24:33.431 "is_configured": true, 00:24:33.431 "data_offset": 0, 00:24:33.431 "data_size": 65536 00:24:33.431 }, 00:24:33.431 { 00:24:33.431 "name": "BaseBdev2", 00:24:33.431 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:33.431 "is_configured": true, 00:24:33.431 "data_offset": 0, 00:24:33.431 "data_size": 65536 00:24:33.431 } 00:24:33.431 ] 00:24:33.431 }' 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.431 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.691 "name": "raid_bdev1", 00:24:33.691 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:33.691 "strip_size_kb": 0, 00:24:33.691 "state": "online", 00:24:33.691 "raid_level": "raid1", 00:24:33.691 "superblock": false, 00:24:33.691 "num_base_bdevs": 2, 00:24:33.691 "num_base_bdevs_discovered": 2, 00:24:33.691 "num_base_bdevs_operational": 2, 00:24:33.691 "base_bdevs_list": [ 00:24:33.691 { 00:24:33.691 "name": "spare", 00:24:33.691 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed", 00:24:33.691 "is_configured": true, 00:24:33.691 "data_offset": 0, 00:24:33.691 "data_size": 65536 00:24:33.691 }, 00:24:33.691 { 00:24:33.691 "name": "BaseBdev2", 00:24:33.691 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d", 00:24:33.691 "is_configured": true, 00:24:33.691 "data_offset": 0, 00:24:33.691 "data_size": 65536 00:24:33.691 } 00:24:33.691 ] 00:24:33.691 }' 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:33.691 
10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.691 10:32:10 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.950 10:32:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:33.950 "name": "raid_bdev1", 00:24:33.950 "uuid": "607eed3a-9b63-43ac-831b-b7b33871ab4c", 00:24:33.950 "strip_size_kb": 0, 00:24:33.950 "state": "online", 00:24:33.950 "raid_level": "raid1", 00:24:33.950 "superblock": false, 00:24:33.950 "num_base_bdevs": 
2,
00:24:33.950 "num_base_bdevs_discovered": 2,
00:24:33.950 "num_base_bdevs_operational": 2,
00:24:33.950 "base_bdevs_list": [
00:24:33.950 {
00:24:33.950 "name": "spare",
00:24:33.950 "uuid": "822f5632-c88d-55fe-a405-525fe6df62ed",
00:24:33.950 "is_configured": true,
00:24:33.950 "data_offset": 0,
00:24:33.950 "data_size": 65536
00:24:33.950 },
00:24:33.950 {
00:24:33.950 "name": "BaseBdev2",
00:24:33.950 "uuid": "3d3f1701-6a10-5a4a-9b88-0dd8cfdfe05d",
00:24:33.950 "is_configured": true,
00:24:33.950 "data_offset": 0,
00:24:33.950 "data_size": 65536
00:24:33.950 }
00:24:33.950 ]
00:24:33.950 }'
00:24:33.950 10:32:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:33.950 10:32:11 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x
00:24:34.518 10:32:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:24:34.776 [2024-07-15 10:32:11.859595] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:24:34.776 [2024-07-15 10:32:11.859627] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:24:34.776
00:24:34.776 Latency(us)
00:24:34.776 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:24:34.776 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:24:34.776 raid_bdev1 : 11.60 101.54 304.62 0.00 0.00 13070.42 284.94 119446.48
00:24:34.776 ===================================================================================================================
00:24:34.776 Total : 101.54 304.62 0.00 0.00 13070.42 284.94 119446.48
00:24:34.776 [2024-07-15 10:32:11.883582] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:34.776 [2024-07-15 10:32:11.883609] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:24:34.776
[2024-07-15 10:32:11.883682] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:34.776 [2024-07-15 10:32:11.883694] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1261070 name raid_bdev1, state offline 00:24:34.776 0 00:24:34.776 10:32:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.776 10:32:11 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.035 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:35.293 /dev/nbd0 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:35.293 1+0 records in 00:24:35.293 1+0 records out 00:24:35.293 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266917 s, 15.3 MB/s 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 
0 ']' 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.293 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:35.552 /dev/nbd1 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:35.552 1+0 records in 00:24:35.552 1+0 records out 00:24:35.552 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269275 s, 15.2 MB/s 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- 
# cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:35.552 10:32:12 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:35.811 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 588523 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 588523 ']' 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 588523 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:24:36.070 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:36.070 10:32:13 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 588523 00:24:36.328 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:36.328 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:36.328 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 588523' 00:24:36.328 killing process with pid 588523 00:24:36.328 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 588523 00:24:36.328 Received shutdown signal, test time was about 13.002756 seconds 00:24:36.328 00:24:36.328 Latency(us) 00:24:36.328 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:36.328 =================================================================================================================== 00:24:36.328 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:36.328 [2024-07-15 10:32:13.284352] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:36.328 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 588523 00:24:36.328 [2024-07-15 10:32:13.305645] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:36.586 00:24:36.586 real 0m17.738s 00:24:36.586 user 0m27.080s 00:24:36.586 sys 0m2.779s 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:36.586 ************************************ 00:24:36.586 END TEST raid_rebuild_test_io 00:24:36.586 ************************************ 00:24:36.586 10:32:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:36.586 10:32:13 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test 
raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:24:36.586 10:32:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:36.586 10:32:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:36.586 10:32:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:36.586 ************************************ 00:24:36.586 START TEST raid_rebuild_test_sb_io 00:24:36.586 ************************************ 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.586 10:32:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:36.586 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=591047 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 591047 /var/tmp/spdk-raid.sock 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 591047 ']' 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:36.587 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:36.587 10:32:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:36.587 [2024-07-15 10:32:13.680039] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:24:36.587 [2024-07-15 10:32:13.680101] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid591047 ] 00:24:36.587 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:36.587 Zero copy mechanism will not be used. 
00:24:36.846 [2024-07-15 10:32:13.805462] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:36.846 [2024-07-15 10:32:13.907103] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:36.846 [2024-07-15 10:32:13.968677] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:36.846 [2024-07-15 10:32:13.968713] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:37.414 10:32:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:37.415 10:32:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:24:37.415 10:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:37.415 10:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:37.673 BaseBdev1_malloc 00:24:37.932 10:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:37.932 [2024-07-15 10:32:15.100235] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:37.932 [2024-07-15 10:32:15.100281] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:37.932 [2024-07-15 10:32:15.100305] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1381d40 00:24:37.932 [2024-07-15 10:32:15.100318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:37.932 [2024-07-15 10:32:15.102073] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:37.932 [2024-07-15 10:32:15.102101] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:37.932 
BaseBdev1 00:24:37.932 10:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:37.933 10:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:38.192 BaseBdev2_malloc 00:24:38.192 10:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:38.451 [2024-07-15 10:32:15.603666] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:38.451 [2024-07-15 10:32:15.603711] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.451 [2024-07-15 10:32:15.603734] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1382860 00:24:38.451 [2024-07-15 10:32:15.603747] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.451 [2024-07-15 10:32:15.605286] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.451 [2024-07-15 10:32:15.605313] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:38.451 BaseBdev2 00:24:38.451 10:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:38.710 spare_malloc 00:24:38.710 10:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:38.969 spare_delay 00:24:38.969 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:39.228 [2024-07-15 10:32:16.338231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:39.228 [2024-07-15 10:32:16.338278] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.228 [2024-07-15 10:32:16.338297] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1530ec0 00:24:39.228 [2024-07-15 10:32:16.338310] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.228 [2024-07-15 10:32:16.339905] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.228 [2024-07-15 10:32:16.339940] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:39.228 spare 00:24:39.228 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:39.488 [2024-07-15 10:32:16.570868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:39.488 [2024-07-15 10:32:16.572201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:39.488 [2024-07-15 10:32:16.572366] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1532070 00:24:39.488 [2024-07-15 10:32:16.572379] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:39.488 [2024-07-15 10:32:16.572572] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152b490 00:24:39.488 [2024-07-15 10:32:16.572712] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1532070 00:24:39.488 [2024-07-15 10:32:16.572722] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1532070 00:24:39.488 [2024-07-15 10:32:16.572819] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.488 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.747 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.747 "name": "raid_bdev1", 00:24:39.747 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:39.747 "strip_size_kb": 0, 00:24:39.747 "state": "online", 00:24:39.747 "raid_level": "raid1", 00:24:39.747 "superblock": true, 00:24:39.747 "num_base_bdevs": 2, 00:24:39.747 
"num_base_bdevs_discovered": 2, 00:24:39.747 "num_base_bdevs_operational": 2, 00:24:39.747 "base_bdevs_list": [ 00:24:39.747 { 00:24:39.747 "name": "BaseBdev1", 00:24:39.747 "uuid": "57ada255-3472-513a-996c-15ec7c084f47", 00:24:39.747 "is_configured": true, 00:24:39.747 "data_offset": 2048, 00:24:39.747 "data_size": 63488 00:24:39.747 }, 00:24:39.747 { 00:24:39.747 "name": "BaseBdev2", 00:24:39.747 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:39.747 "is_configured": true, 00:24:39.747 "data_offset": 2048, 00:24:39.747 "data_size": 63488 00:24:39.747 } 00:24:39.747 ] 00:24:39.747 }' 00:24:39.747 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.747 10:32:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:40.311 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:40.312 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:40.570 [2024-07-15 10:32:17.653986] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:40.570 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:40.570 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.570 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:40.828 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:40.828 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:40.828 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:40.828 10:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:41.086 [2024-07-15 10:32:18.032842] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1532c50 00:24:41.086 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:41.086 Zero copy mechanism will not be used. 00:24:41.086 Running I/O for 60 seconds... 00:24:41.086 [2024-07-15 10:32:18.150322] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:41.086 [2024-07-15 10:32:18.150521] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1532c50 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.086 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.343 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.343 "name": "raid_bdev1", 00:24:41.343 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:41.343 "strip_size_kb": 0, 00:24:41.343 "state": "online", 00:24:41.343 "raid_level": "raid1", 00:24:41.343 "superblock": true, 00:24:41.343 "num_base_bdevs": 2, 00:24:41.343 "num_base_bdevs_discovered": 1, 00:24:41.343 "num_base_bdevs_operational": 1, 00:24:41.343 "base_bdevs_list": [ 00:24:41.343 { 00:24:41.343 "name": null, 00:24:41.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.343 "is_configured": false, 00:24:41.343 "data_offset": 2048, 00:24:41.343 "data_size": 63488 00:24:41.343 }, 00:24:41.343 { 00:24:41.343 "name": "BaseBdev2", 00:24:41.343 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:41.343 "is_configured": true, 00:24:41.343 "data_offset": 2048, 00:24:41.343 "data_size": 63488 00:24:41.343 } 00:24:41.343 ] 00:24:41.343 }' 00:24:41.343 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.343 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:41.908 10:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:42.165 [2024-07-15 10:32:19.143154] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:42.165 10:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:42.165 [2024-07-15 10:32:19.193629] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x149e230 00:24:42.165 [2024-07-15 10:32:19.195980] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:42.165 [2024-07-15 10:32:19.305725] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:42.165 [2024-07-15 10:32:19.306127] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:42.422 [2024-07-15 10:32:19.525795] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:42.422 [2024-07-15 10:32:19.525975] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:42.679 [2024-07-15 10:32:19.776505] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:42.936 [2024-07-15 10:32:19.895559] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:42.936 [2024-07-15 10:32:19.895736] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:43.192 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:43.192 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:43.192 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:43.192 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:43.192 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:43.192 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.192 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.192 [2024-07-15 10:32:20.243310] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:43.192 [2024-07-15 10:32:20.243628] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:43.449 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.449 "name": "raid_bdev1", 00:24:43.449 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:43.449 "strip_size_kb": 0, 00:24:43.449 "state": "online", 00:24:43.449 "raid_level": "raid1", 00:24:43.449 "superblock": true, 00:24:43.449 "num_base_bdevs": 2, 00:24:43.449 "num_base_bdevs_discovered": 2, 00:24:43.449 "num_base_bdevs_operational": 2, 00:24:43.449 "process": { 00:24:43.449 "type": "rebuild", 00:24:43.449 "target": "spare", 00:24:43.449 "progress": { 00:24:43.449 "blocks": 14336, 00:24:43.449 "percent": 22 00:24:43.449 } 00:24:43.449 }, 00:24:43.449 "base_bdevs_list": [ 00:24:43.449 { 00:24:43.449 "name": "spare", 00:24:43.449 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:43.449 "is_configured": true, 00:24:43.449 "data_offset": 2048, 00:24:43.449 "data_size": 63488 00:24:43.449 }, 00:24:43.449 { 00:24:43.449 "name": "BaseBdev2", 00:24:43.449 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:43.449 "is_configured": true, 00:24:43.449 "data_offset": 2048, 00:24:43.449 "data_size": 63488 00:24:43.449 } 00:24:43.449 ] 00:24:43.449 }' 00:24:43.449 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.449 [2024-07-15 10:32:20.488147] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 
offset_begin: 12288 offset_end: 18432 00:24:43.449 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:43.449 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.449 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:43.449 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:43.707 [2024-07-15 10:32:20.764806] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:43.707 [2024-07-15 10:32:20.803712] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:43.965 [2024-07-15 10:32:20.920099] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:43.965 [2024-07-15 10:32:20.922045] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:43.965 [2024-07-15 10:32:20.922073] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:43.965 [2024-07-15 10:32:20.922084] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:43.965 [2024-07-15 10:32:20.936132] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1532c50 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.965 10:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.234 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.234 "name": "raid_bdev1", 00:24:44.234 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:44.234 "strip_size_kb": 0, 00:24:44.234 "state": "online", 00:24:44.234 "raid_level": "raid1", 00:24:44.234 "superblock": true, 00:24:44.234 "num_base_bdevs": 2, 00:24:44.234 "num_base_bdevs_discovered": 1, 00:24:44.234 "num_base_bdevs_operational": 1, 00:24:44.234 "base_bdevs_list": [ 00:24:44.234 { 00:24:44.234 "name": null, 00:24:44.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.234 "is_configured": false, 00:24:44.234 "data_offset": 2048, 00:24:44.234 "data_size": 63488 00:24:44.234 }, 00:24:44.234 { 00:24:44.234 "name": "BaseBdev2", 00:24:44.234 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:44.234 "is_configured": true, 00:24:44.234 "data_offset": 2048, 00:24:44.234 "data_size": 63488 00:24:44.234 } 00:24:44.234 ] 00:24:44.234 }' 00:24:44.234 10:32:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.234 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:44.813 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:44.813 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:44.813 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:44.813 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:44.813 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:44.813 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.813 10:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.071 10:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.071 "name": "raid_bdev1", 00:24:45.071 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:45.071 "strip_size_kb": 0, 00:24:45.071 "state": "online", 00:24:45.071 "raid_level": "raid1", 00:24:45.071 "superblock": true, 00:24:45.071 "num_base_bdevs": 2, 00:24:45.071 "num_base_bdevs_discovered": 1, 00:24:45.071 "num_base_bdevs_operational": 1, 00:24:45.071 "base_bdevs_list": [ 00:24:45.071 { 00:24:45.071 "name": null, 00:24:45.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.071 "is_configured": false, 00:24:45.071 "data_offset": 2048, 00:24:45.071 "data_size": 63488 00:24:45.071 }, 00:24:45.071 { 00:24:45.071 "name": "BaseBdev2", 00:24:45.071 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:45.071 "is_configured": true, 00:24:45.071 "data_offset": 2048, 00:24:45.071 
"data_size": 63488 00:24:45.071 } 00:24:45.071 ] 00:24:45.071 }' 00:24:45.071 10:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.071 10:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.071 10:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.071 10:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.071 10:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:45.329 [2024-07-15 10:32:22.447224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:45.329 [2024-07-15 10:32:22.507064] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1532e60 00:24:45.329 10:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:45.329 [2024-07-15 10:32:22.508580] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:45.587 [2024-07-15 10:32:22.651709] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:45.587 [2024-07-15 10:32:22.660059] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:45.845 [2024-07-15 10:32:22.907537] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:45.845 [2024-07-15 10:32:22.907789] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:46.103 [2024-07-15 10:32:23.239342] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 
offset_end: 12288 00:24:46.360 [2024-07-15 10:32:23.450008] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:46.360 [2024-07-15 10:32:23.450165] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:46.360 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.360 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.360 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.360 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.360 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.360 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.360 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.618 [2024-07-15 10:32:23.696687] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:46.618 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:46.618 "name": "raid_bdev1", 00:24:46.618 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:46.618 "strip_size_kb": 0, 00:24:46.618 "state": "online", 00:24:46.618 "raid_level": "raid1", 00:24:46.618 "superblock": true, 00:24:46.618 "num_base_bdevs": 2, 00:24:46.618 "num_base_bdevs_discovered": 2, 00:24:46.618 "num_base_bdevs_operational": 2, 00:24:46.618 "process": { 00:24:46.618 "type": "rebuild", 00:24:46.618 "target": "spare", 00:24:46.618 "progress": { 
00:24:46.618 "blocks": 14336, 00:24:46.618 "percent": 22 00:24:46.618 } 00:24:46.618 }, 00:24:46.618 "base_bdevs_list": [ 00:24:46.618 { 00:24:46.618 "name": "spare", 00:24:46.618 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:46.618 "is_configured": true, 00:24:46.618 "data_offset": 2048, 00:24:46.618 "data_size": 63488 00:24:46.618 }, 00:24:46.618 { 00:24:46.618 "name": "BaseBdev2", 00:24:46.618 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:46.618 "is_configured": true, 00:24:46.618 "data_offset": 2048, 00:24:46.618 "data_size": 63488 00:24:46.618 } 00:24:46.618 ] 00:24:46.618 }' 00:24:46.618 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:46.618 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:46.618 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:46.876 [2024-07-15 10:32:23.833103] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:46.876 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local 
timeout=828 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.876 10:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.134 10:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.134 "name": "raid_bdev1", 00:24:47.134 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:47.134 "strip_size_kb": 0, 00:24:47.134 "state": "online", 00:24:47.134 "raid_level": "raid1", 00:24:47.134 "superblock": true, 00:24:47.134 "num_base_bdevs": 2, 00:24:47.134 "num_base_bdevs_discovered": 2, 00:24:47.134 "num_base_bdevs_operational": 2, 00:24:47.134 "process": { 00:24:47.134 "type": "rebuild", 00:24:47.134 "target": "spare", 00:24:47.134 "progress": { 00:24:47.134 "blocks": 18432, 00:24:47.134 "percent": 29 00:24:47.134 } 00:24:47.134 }, 00:24:47.134 "base_bdevs_list": [ 00:24:47.134 { 00:24:47.134 "name": "spare", 00:24:47.134 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:47.134 "is_configured": true, 00:24:47.134 "data_offset": 2048, 00:24:47.134 "data_size": 63488 00:24:47.134 }, 00:24:47.134 { 00:24:47.134 "name": 
"BaseBdev2", 00:24:47.134 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:47.134 "is_configured": true, 00:24:47.134 "data_offset": 2048, 00:24:47.134 "data_size": 63488 00:24:47.134 } 00:24:47.134 ] 00:24:47.134 }' 00:24:47.134 10:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.134 10:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:47.134 10:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.134 10:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:47.134 10:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:47.134 [2024-07-15 10:32:24.216909] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:47.393 [2024-07-15 10:32:24.363309] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:47.651 [2024-07-15 10:32:24.718282] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:47.909 [2024-07-15 10:32:24.928706] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 
-- # local target=spare 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.168 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.426 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.426 "name": "raid_bdev1", 00:24:48.426 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:48.426 "strip_size_kb": 0, 00:24:48.426 "state": "online", 00:24:48.426 "raid_level": "raid1", 00:24:48.426 "superblock": true, 00:24:48.426 "num_base_bdevs": 2, 00:24:48.426 "num_base_bdevs_discovered": 2, 00:24:48.426 "num_base_bdevs_operational": 2, 00:24:48.426 "process": { 00:24:48.426 "type": "rebuild", 00:24:48.426 "target": "spare", 00:24:48.426 "progress": { 00:24:48.426 "blocks": 34816, 00:24:48.426 "percent": 54 00:24:48.426 } 00:24:48.426 }, 00:24:48.426 "base_bdevs_list": [ 00:24:48.426 { 00:24:48.426 "name": "spare", 00:24:48.426 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:48.426 "is_configured": true, 00:24:48.426 "data_offset": 2048, 00:24:48.426 "data_size": 63488 00:24:48.426 }, 00:24:48.426 { 00:24:48.426 "name": "BaseBdev2", 00:24:48.426 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:48.426 "is_configured": true, 00:24:48.426 "data_offset": 2048, 00:24:48.426 "data_size": 63488 00:24:48.426 } 00:24:48.426 ] 00:24:48.426 }' 00:24:48.426 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.426 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.426 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:24:48.426 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:48.426 10:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:48.685 [2024-07-15 10:32:25.752780] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.620 [2024-07-15 10:32:26.776312] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:49.620 "name": "raid_bdev1", 00:24:49.620 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:49.620 "strip_size_kb": 0, 00:24:49.620 "state": "online", 00:24:49.620 "raid_level": "raid1", 00:24:49.620 "superblock": true, 00:24:49.620 "num_base_bdevs": 2, 00:24:49.620 "num_base_bdevs_discovered": 2, 00:24:49.620 
"num_base_bdevs_operational": 2, 00:24:49.620 "process": { 00:24:49.620 "type": "rebuild", 00:24:49.620 "target": "spare", 00:24:49.620 "progress": { 00:24:49.620 "blocks": 55296, 00:24:49.620 "percent": 87 00:24:49.620 } 00:24:49.620 }, 00:24:49.620 "base_bdevs_list": [ 00:24:49.620 { 00:24:49.620 "name": "spare", 00:24:49.620 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:49.620 "is_configured": true, 00:24:49.620 "data_offset": 2048, 00:24:49.620 "data_size": 63488 00:24:49.620 }, 00:24:49.620 { 00:24:49.620 "name": "BaseBdev2", 00:24:49.620 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:49.620 "is_configured": true, 00:24:49.620 "data_offset": 2048, 00:24:49.620 "data_size": 63488 00:24:49.620 } 00:24:49.620 ] 00:24:49.620 }' 00:24:49.620 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.878 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:49.878 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:49.878 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:49.878 10:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:50.136 [2024-07-15 10:32:27.114820] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:50.136 [2024-07-15 10:32:27.223061] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:50.136 [2024-07-15 10:32:27.224661] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.702 10:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.961 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:50.961 "name": "raid_bdev1", 00:24:50.961 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:50.961 "strip_size_kb": 0, 00:24:50.961 "state": "online", 00:24:50.961 "raid_level": "raid1", 00:24:50.961 "superblock": true, 00:24:50.961 "num_base_bdevs": 2, 00:24:50.961 "num_base_bdevs_discovered": 2, 00:24:50.961 "num_base_bdevs_operational": 2, 00:24:50.961 "base_bdevs_list": [ 00:24:50.961 { 00:24:50.961 "name": "spare", 00:24:50.961 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:50.961 "is_configured": true, 00:24:50.961 "data_offset": 2048, 00:24:50.961 "data_size": 63488 00:24:50.961 }, 00:24:50.961 { 00:24:50.961 "name": "BaseBdev2", 00:24:50.961 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:50.961 "is_configured": true, 00:24:50.961 "data_offset": 2048, 00:24:50.961 "data_size": 63488 00:24:50.961 } 00:24:50.961 ] 00:24:50.961 }' 00:24:50.961 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.219 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.477 "name": "raid_bdev1", 00:24:51.477 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:51.477 "strip_size_kb": 0, 00:24:51.477 "state": "online", 00:24:51.477 "raid_level": "raid1", 00:24:51.477 "superblock": true, 00:24:51.477 "num_base_bdevs": 2, 00:24:51.477 "num_base_bdevs_discovered": 2, 00:24:51.477 "num_base_bdevs_operational": 2, 00:24:51.477 "base_bdevs_list": [ 00:24:51.477 { 00:24:51.477 "name": "spare", 00:24:51.477 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:51.477 "is_configured": true, 00:24:51.477 "data_offset": 2048, 00:24:51.477 "data_size": 63488 00:24:51.477 }, 00:24:51.477 { 00:24:51.477 "name": "BaseBdev2", 00:24:51.477 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 
00:24:51.477 "is_configured": true, 00:24:51.477 "data_offset": 2048, 00:24:51.477 "data_size": 63488 00:24:51.477 } 00:24:51.477 ] 00:24:51.477 }' 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.477 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.735 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:51.735 "name": "raid_bdev1", 00:24:51.735 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:51.735 "strip_size_kb": 0, 00:24:51.735 "state": "online", 00:24:51.735 "raid_level": "raid1", 00:24:51.735 "superblock": true, 00:24:51.735 "num_base_bdevs": 2, 00:24:51.735 "num_base_bdevs_discovered": 2, 00:24:51.735 "num_base_bdevs_operational": 2, 00:24:51.735 "base_bdevs_list": [ 00:24:51.735 { 00:24:51.735 "name": "spare", 00:24:51.735 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:51.735 "is_configured": true, 00:24:51.735 "data_offset": 2048, 00:24:51.735 "data_size": 63488 00:24:51.735 }, 00:24:51.735 { 00:24:51.735 "name": "BaseBdev2", 00:24:51.735 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:51.735 "is_configured": true, 00:24:51.735 "data_offset": 2048, 00:24:51.735 "data_size": 63488 00:24:51.735 } 00:24:51.735 ] 00:24:51.735 }' 00:24:51.735 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.735 10:32:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:52.302 10:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:52.560 [2024-07-15 10:32:29.651256] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:52.560 [2024-07-15 10:32:29.651292] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:52.560 00:24:52.560 Latency(us) 00:24:52.560 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:52.560 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:52.560 
raid_bdev1 : 11.69 94.12 282.35 0.00 0.00 14374.07 300.97 110784.33 00:24:52.560 =================================================================================================================== 00:24:52.560 Total : 94.12 282.35 0.00 0.00 14374.07 300.97 110784.33 00:24:52.560 [2024-07-15 10:32:29.755543] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:52.560 [2024-07-15 10:32:29.755592] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:52.560 [2024-07-15 10:32:29.755666] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:52.560 [2024-07-15 10:32:29.755679] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1532070 name raid_bdev1, state offline 00:24:52.560 0 00:24:52.819 10:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.819 10:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 
00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.078 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:53.078 /dev/nbd0 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:53.337 1+0 records in 00:24:53.337 1+0 records out 00:24:53.337 4096 bytes (4.1 kB, 
4.0 KiB) copied, 0.000423922 s, 9.7 MB/s 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 
0 )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.337 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:24:53.596 /dev/nbd1 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:53.596 1+0 records in 00:24:53.596 1+0 records out 00:24:53.596 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312538 s, 13.1 MB/s 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@884 -- # size=4096 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:53.596 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:53.855 10:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:54.113 10:32:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:54.113 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:54.371 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:54.629 [2024-07-15 10:32:31.700786] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:54.629 [2024-07-15 10:32:31.700832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:54.629 [2024-07-15 10:32:31.700856] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1392e70 00:24:54.629 [2024-07-15 10:32:31.700869] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:54.629 [2024-07-15 10:32:31.702493] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:54.629 [2024-07-15 10:32:31.702521] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:54.629 [2024-07-15 10:32:31.702598] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:54.629 [2024-07-15 10:32:31.702627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:54.629 [2024-07-15 10:32:31.702729] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:54.629 spare 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:54.629 10:32:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.629 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.629 [2024-07-15 10:32:31.803046] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15322f0 00:24:54.629 [2024-07-15 10:32:31.803061] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:54.629 [2024-07-15 10:32:31.803242] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152e670 00:24:54.629 [2024-07-15 10:32:31.803380] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15322f0 00:24:54.629 [2024-07-15 10:32:31.803390] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15322f0 00:24:54.629 [2024-07-15 10:32:31.803492] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:54.887 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.887 "name": "raid_bdev1", 00:24:54.887 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:54.887 "strip_size_kb": 0, 00:24:54.887 "state": "online", 00:24:54.887 "raid_level": "raid1", 00:24:54.887 "superblock": true, 00:24:54.887 "num_base_bdevs": 2, 00:24:54.887 "num_base_bdevs_discovered": 2, 00:24:54.887 "num_base_bdevs_operational": 2, 00:24:54.887 "base_bdevs_list": [ 00:24:54.887 { 00:24:54.887 "name": "spare", 00:24:54.887 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:54.887 "is_configured": true, 00:24:54.887 "data_offset": 2048, 00:24:54.887 "data_size": 63488 00:24:54.887 }, 00:24:54.887 { 00:24:54.887 "name": "BaseBdev2", 00:24:54.887 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:54.887 "is_configured": true, 00:24:54.887 "data_offset": 2048, 00:24:54.887 "data_size": 63488 00:24:54.887 } 00:24:54.887 ] 00:24:54.887 }' 00:24:54.887 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.887 10:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:55.453 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:55.453 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.453 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:55.453 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:55.453 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.453 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.453 10:32:32 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.709 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.710 "name": "raid_bdev1", 00:24:55.710 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:55.710 "strip_size_kb": 0, 00:24:55.710 "state": "online", 00:24:55.710 "raid_level": "raid1", 00:24:55.710 "superblock": true, 00:24:55.710 "num_base_bdevs": 2, 00:24:55.710 "num_base_bdevs_discovered": 2, 00:24:55.710 "num_base_bdevs_operational": 2, 00:24:55.710 "base_bdevs_list": [ 00:24:55.710 { 00:24:55.710 "name": "spare", 00:24:55.710 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:55.710 "is_configured": true, 00:24:55.710 "data_offset": 2048, 00:24:55.710 "data_size": 63488 00:24:55.710 }, 00:24:55.710 { 00:24:55.710 "name": "BaseBdev2", 00:24:55.710 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:55.710 "is_configured": true, 00:24:55.710 "data_offset": 2048, 00:24:55.710 "data_size": 63488 00:24:55.710 } 00:24:55.710 ] 00:24:55.710 }' 00:24:55.710 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:55.710 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:55.710 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:55.966 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:55.966 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.966 10:32:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:56.222 [2024-07-15 10:32:33.393605] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.222 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.477 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:56.477 "name": "raid_bdev1", 00:24:56.477 "uuid": 
"35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:56.477 "strip_size_kb": 0, 00:24:56.477 "state": "online", 00:24:56.477 "raid_level": "raid1", 00:24:56.477 "superblock": true, 00:24:56.477 "num_base_bdevs": 2, 00:24:56.477 "num_base_bdevs_discovered": 1, 00:24:56.477 "num_base_bdevs_operational": 1, 00:24:56.477 "base_bdevs_list": [ 00:24:56.477 { 00:24:56.477 "name": null, 00:24:56.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.477 "is_configured": false, 00:24:56.477 "data_offset": 2048, 00:24:56.477 "data_size": 63488 00:24:56.477 }, 00:24:56.477 { 00:24:56.477 "name": "BaseBdev2", 00:24:56.477 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:56.477 "is_configured": true, 00:24:56.477 "data_offset": 2048, 00:24:56.477 "data_size": 63488 00:24:56.477 } 00:24:56.477 ] 00:24:56.477 }' 00:24:56.477 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:56.477 10:32:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:57.410 10:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:57.411 [2024-07-15 10:32:34.468615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:57.411 [2024-07-15 10:32:34.468775] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:57.411 [2024-07-15 10:32:34.468793] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:57.411 [2024-07-15 10:32:34.468822] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:57.411 [2024-07-15 10:32:34.474121] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152b490 00:24:57.411 [2024-07-15 10:32:34.476450] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:57.411 10:32:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:58.343 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:58.343 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.343 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:58.343 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:58.343 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.343 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.343 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.659 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.659 "name": "raid_bdev1", 00:24:58.659 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:58.659 "strip_size_kb": 0, 00:24:58.659 "state": "online", 00:24:58.659 "raid_level": "raid1", 00:24:58.659 "superblock": true, 00:24:58.659 "num_base_bdevs": 2, 00:24:58.659 "num_base_bdevs_discovered": 2, 00:24:58.659 "num_base_bdevs_operational": 2, 00:24:58.659 "process": { 00:24:58.659 "type": "rebuild", 00:24:58.659 "target": "spare", 00:24:58.659 "progress": { 00:24:58.659 "blocks": 24576, 
00:24:58.659 "percent": 38 00:24:58.659 } 00:24:58.659 }, 00:24:58.659 "base_bdevs_list": [ 00:24:58.659 { 00:24:58.659 "name": "spare", 00:24:58.659 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:24:58.659 "is_configured": true, 00:24:58.659 "data_offset": 2048, 00:24:58.659 "data_size": 63488 00:24:58.659 }, 00:24:58.659 { 00:24:58.659 "name": "BaseBdev2", 00:24:58.659 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:58.659 "is_configured": true, 00:24:58.659 "data_offset": 2048, 00:24:58.659 "data_size": 63488 00:24:58.659 } 00:24:58.659 ] 00:24:58.659 }' 00:24:58.659 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.659 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:58.659 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.918 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:58.918 10:32:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:58.918 [2024-07-15 10:32:36.071038] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:58.919 [2024-07-15 10:32:36.089392] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:58.919 [2024-07-15 10:32:36.089436] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:58.919 [2024-07-15 10:32:36.089451] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:58.919 [2024-07-15 10:32:36.089460] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:59.175 "name": "raid_bdev1", 00:24:59.175 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:24:59.175 "strip_size_kb": 0, 00:24:59.175 "state": "online", 00:24:59.175 "raid_level": "raid1", 00:24:59.175 "superblock": true, 00:24:59.175 "num_base_bdevs": 2, 00:24:59.175 "num_base_bdevs_discovered": 1, 00:24:59.175 "num_base_bdevs_operational": 1, 00:24:59.175 "base_bdevs_list": [ 00:24:59.175 { 00:24:59.175 "name": null, 00:24:59.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.175 "is_configured": false, 00:24:59.175 
"data_offset": 2048, 00:24:59.175 "data_size": 63488 00:24:59.175 }, 00:24:59.175 { 00:24:59.175 "name": "BaseBdev2", 00:24:59.175 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:24:59.175 "is_configured": true, 00:24:59.175 "data_offset": 2048, 00:24:59.175 "data_size": 63488 00:24:59.175 } 00:24:59.175 ] 00:24:59.175 }' 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:59.175 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:59.739 10:32:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:59.996 [2024-07-15 10:32:37.069509] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:59.996 [2024-07-15 10:32:37.069563] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:59.996 [2024-07-15 10:32:37.069587] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1533650 00:24:59.996 [2024-07-15 10:32:37.069599] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:59.996 [2024-07-15 10:32:37.069976] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:59.996 [2024-07-15 10:32:37.069995] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:59.996 [2024-07-15 10:32:37.070080] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:59.996 [2024-07-15 10:32:37.070093] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:59.996 [2024-07-15 10:32:37.070111] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:59.996 [2024-07-15 10:32:37.070130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:59.996 [2024-07-15 10:32:37.075473] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1392d00 00:24:59.996 spare 00:24:59.996 [2024-07-15 10:32:37.076939] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:59.996 10:32:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:00.928 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.928 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.928 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.928 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.928 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.928 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.928 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.185 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.185 "name": "raid_bdev1", 00:25:01.185 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:25:01.185 "strip_size_kb": 0, 00:25:01.185 "state": "online", 00:25:01.185 "raid_level": "raid1", 00:25:01.185 "superblock": true, 00:25:01.185 "num_base_bdevs": 2, 00:25:01.185 "num_base_bdevs_discovered": 2, 00:25:01.185 "num_base_bdevs_operational": 2, 00:25:01.185 "process": { 00:25:01.185 "type": "rebuild", 00:25:01.185 "target": "spare", 00:25:01.185 "progress": { 00:25:01.185 
"blocks": 24576, 00:25:01.185 "percent": 38 00:25:01.185 } 00:25:01.185 }, 00:25:01.185 "base_bdevs_list": [ 00:25:01.185 { 00:25:01.185 "name": "spare", 00:25:01.185 "uuid": "25bdca8a-6b1f-55e0-bcd8-557c46ad07c4", 00:25:01.185 "is_configured": true, 00:25:01.185 "data_offset": 2048, 00:25:01.185 "data_size": 63488 00:25:01.185 }, 00:25:01.185 { 00:25:01.185 "name": "BaseBdev2", 00:25:01.185 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:25:01.185 "is_configured": true, 00:25:01.185 "data_offset": 2048, 00:25:01.185 "data_size": 63488 00:25:01.185 } 00:25:01.185 ] 00:25:01.185 }' 00:25:01.185 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.443 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.443 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.443 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.443 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:01.700 [2024-07-15 10:32:38.673347] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:01.700 [2024-07-15 10:32:38.689825] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:01.700 [2024-07-15 10:32:38.689871] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:01.700 [2024-07-15 10:32:38.689886] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:01.700 [2024-07-15 10:32:38.689895] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.700 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.957 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:01.957 "name": "raid_bdev1", 00:25:01.957 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:25:01.957 "strip_size_kb": 0, 00:25:01.957 "state": "online", 00:25:01.957 "raid_level": "raid1", 00:25:01.957 "superblock": true, 00:25:01.957 "num_base_bdevs": 2, 00:25:01.957 "num_base_bdevs_discovered": 1, 00:25:01.957 "num_base_bdevs_operational": 1, 00:25:01.957 "base_bdevs_list": [ 00:25:01.957 { 00:25:01.957 "name": null, 00:25:01.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.957 "is_configured": false, 00:25:01.957 
"data_offset": 2048, 00:25:01.957 "data_size": 63488 00:25:01.957 }, 00:25:01.957 { 00:25:01.957 "name": "BaseBdev2", 00:25:01.957 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:25:01.957 "is_configured": true, 00:25:01.957 "data_offset": 2048, 00:25:01.957 "data_size": 63488 00:25:01.957 } 00:25:01.957 ] 00:25:01.957 }' 00:25:01.957 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:01.957 10:32:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:02.521 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:02.521 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:02.521 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:02.521 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:02.521 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:02.521 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.521 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.779 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:02.779 "name": "raid_bdev1", 00:25:02.779 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:25:02.779 "strip_size_kb": 0, 00:25:02.779 "state": "online", 00:25:02.779 "raid_level": "raid1", 00:25:02.779 "superblock": true, 00:25:02.779 "num_base_bdevs": 2, 00:25:02.779 "num_base_bdevs_discovered": 1, 00:25:02.779 "num_base_bdevs_operational": 1, 00:25:02.779 "base_bdevs_list": [ 00:25:02.779 { 00:25:02.779 "name": null, 00:25:02.779 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:02.779 "is_configured": false, 00:25:02.779 "data_offset": 2048, 00:25:02.779 "data_size": 63488 00:25:02.779 }, 00:25:02.779 { 00:25:02.779 "name": "BaseBdev2", 00:25:02.779 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:25:02.779 "is_configured": true, 00:25:02.779 "data_offset": 2048, 00:25:02.779 "data_size": 63488 00:25:02.779 } 00:25:02.779 ] 00:25:02.779 }' 00:25:02.779 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:02.779 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:02.779 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:02.779 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:02.779 10:32:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:03.035 10:32:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:03.291 [2024-07-15 10:32:40.283725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:03.291 [2024-07-15 10:32:40.283776] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:03.291 [2024-07-15 10:32:40.283807] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1381490 00:25:03.291 [2024-07-15 10:32:40.283820] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:03.291 [2024-07-15 10:32:40.284170] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:03.291 [2024-07-15 10:32:40.284187] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:03.291 [2024-07-15 10:32:40.284252] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:03.291 [2024-07-15 10:32:40.284265] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:03.291 [2024-07-15 10:32:40.284276] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:03.291 BaseBdev1 00:25:03.292 10:32:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.223 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.223 10:32:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.480 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.480 "name": "raid_bdev1", 00:25:04.480 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:25:04.480 "strip_size_kb": 0, 00:25:04.480 "state": "online", 00:25:04.480 "raid_level": "raid1", 00:25:04.480 "superblock": true, 00:25:04.480 "num_base_bdevs": 2, 00:25:04.480 "num_base_bdevs_discovered": 1, 00:25:04.480 "num_base_bdevs_operational": 1, 00:25:04.480 "base_bdevs_list": [ 00:25:04.480 { 00:25:04.480 "name": null, 00:25:04.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.480 "is_configured": false, 00:25:04.480 "data_offset": 2048, 00:25:04.480 "data_size": 63488 00:25:04.480 }, 00:25:04.480 { 00:25:04.480 "name": "BaseBdev2", 00:25:04.480 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:25:04.480 "is_configured": true, 00:25:04.480 "data_offset": 2048, 00:25:04.480 "data_size": 63488 00:25:04.480 } 00:25:04.480 ] 00:25:04.480 }' 00:25:04.480 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.480 10:32:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:05.041 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:05.041 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:05.041 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:05.041 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:05.041 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:05.041 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.041 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:05.297 "name": "raid_bdev1", 00:25:05.297 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:25:05.297 "strip_size_kb": 0, 00:25:05.297 "state": "online", 00:25:05.297 "raid_level": "raid1", 00:25:05.297 "superblock": true, 00:25:05.297 "num_base_bdevs": 2, 00:25:05.297 "num_base_bdevs_discovered": 1, 00:25:05.297 "num_base_bdevs_operational": 1, 00:25:05.297 "base_bdevs_list": [ 00:25:05.297 { 00:25:05.297 "name": null, 00:25:05.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.297 "is_configured": false, 00:25:05.297 "data_offset": 2048, 00:25:05.297 "data_size": 63488 00:25:05.297 }, 00:25:05.297 { 00:25:05.297 "name": "BaseBdev2", 00:25:05.297 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:25:05.297 "is_configured": true, 00:25:05.297 "data_offset": 2048, 00:25:05.297 "data_size": 63488 00:25:05.297 } 00:25:05.297 ] 00:25:05.297 }' 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:05.297 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:05.860 [2024-07-15 10:32:42.887021] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:05.860 [2024-07-15 10:32:42.887152] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:05.860 
[2024-07-15 10:32:42.887168] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:05.860 request: 00:25:05.860 { 00:25:05.860 "base_bdev": "BaseBdev1", 00:25:05.860 "raid_bdev": "raid_bdev1", 00:25:05.860 "method": "bdev_raid_add_base_bdev", 00:25:05.860 "req_id": 1 00:25:05.860 } 00:25:05.860 Got JSON-RPC error response 00:25:05.860 response: 00:25:05.860 { 00:25:05.860 "code": -22, 00:25:05.860 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:05.860 } 00:25:05.860 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:25:05.860 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:05.860 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:05.860 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:05.860 10:32:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.802 10:32:43 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.802 10:32:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.060 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.060 "name": "raid_bdev1", 00:25:07.060 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:25:07.060 "strip_size_kb": 0, 00:25:07.060 "state": "online", 00:25:07.060 "raid_level": "raid1", 00:25:07.060 "superblock": true, 00:25:07.060 "num_base_bdevs": 2, 00:25:07.060 "num_base_bdevs_discovered": 1, 00:25:07.060 "num_base_bdevs_operational": 1, 00:25:07.060 "base_bdevs_list": [ 00:25:07.060 { 00:25:07.060 "name": null, 00:25:07.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.060 "is_configured": false, 00:25:07.060 "data_offset": 2048, 00:25:07.060 "data_size": 63488 00:25:07.060 }, 00:25:07.060 { 00:25:07.060 "name": "BaseBdev2", 00:25:07.060 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:25:07.060 "is_configured": true, 00:25:07.060 "data_offset": 2048, 00:25:07.060 "data_size": 63488 00:25:07.060 } 00:25:07.060 ] 00:25:07.060 }' 00:25:07.060 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.060 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:07.625 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:07.625 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:07.625 10:32:44 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:07.625 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:07.625 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:07.625 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.625 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.882 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.882 "name": "raid_bdev1", 00:25:07.882 "uuid": "35af3edc-a0a0-4625-9daa-b7311bd24608", 00:25:07.882 "strip_size_kb": 0, 00:25:07.882 "state": "online", 00:25:07.882 "raid_level": "raid1", 00:25:07.882 "superblock": true, 00:25:07.882 "num_base_bdevs": 2, 00:25:07.882 "num_base_bdevs_discovered": 1, 00:25:07.882 "num_base_bdevs_operational": 1, 00:25:07.882 "base_bdevs_list": [ 00:25:07.882 { 00:25:07.882 "name": null, 00:25:07.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.882 "is_configured": false, 00:25:07.882 "data_offset": 2048, 00:25:07.882 "data_size": 63488 00:25:07.882 }, 00:25:07.882 { 00:25:07.882 "name": "BaseBdev2", 00:25:07.882 "uuid": "d64d8875-8701-59b4-be61-9b2a732679bc", 00:25:07.882 "is_configured": true, 00:25:07.882 "data_offset": 2048, 00:25:07.882 "data_size": 63488 00:25:07.882 } 00:25:07.882 ] 00:25:07.882 }' 00:25:07.882 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.882 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:07.882 10:32:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:07.882 10:32:45 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 591047 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 591047 ']' 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 591047 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 591047 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 591047' 00:25:07.882 killing process with pid 591047 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 591047 00:25:07.882 Received shutdown signal, test time was about 26.965814 seconds 00:25:07.882 00:25:07.882 Latency(us) 00:25:07.882 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:07.882 =================================================================================================================== 00:25:07.882 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:07.882 [2024-07-15 10:32:45.066721] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:07.882 [2024-07-15 10:32:45.066815] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:07.882 [2024-07-15 10:32:45.066861] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base 
bdevs is 0, going to free all in destruct 00:25:07.882 [2024-07-15 10:32:45.066873] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15322f0 name raid_bdev1, state offline 00:25:07.882 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 591047 00:25:08.139 [2024-07-15 10:32:45.089112] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:08.139 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:08.139 00:25:08.139 real 0m31.692s 00:25:08.139 user 0m49.501s 00:25:08.139 sys 0m4.628s 00:25:08.139 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:08.139 10:32:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:08.139 ************************************ 00:25:08.139 END TEST raid_rebuild_test_sb_io 00:25:08.139 ************************************ 00:25:08.396 10:32:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:08.396 10:32:45 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:25:08.396 10:32:45 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:25:08.396 10:32:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:08.396 10:32:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:08.396 10:32:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:08.396 ************************************ 00:25:08.396 START TEST raid_rebuild_test 00:25:08.396 ************************************ 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:08.396 10:32:45 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=595546 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 595546 /var/tmp/spdk-raid.sock 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 595546 ']' 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:08.396 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:08.396 10:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:08.396 [2024-07-15 10:32:45.441876] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:25:08.396 [2024-07-15 10:32:45.441947] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid595546 ] 00:25:08.396 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:08.396 Zero copy mechanism will not be used. 00:25:08.396 [2024-07-15 10:32:45.570990] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:08.653 [2024-07-15 10:32:45.673811] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:08.654 [2024-07-15 10:32:45.733941] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:08.654 [2024-07-15 10:32:45.733978] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:09.259 10:32:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:09.259 10:32:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:25:09.259 10:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:09.259 10:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:09.516 BaseBdev1_malloc 00:25:09.516 10:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:09.774 [2024-07-15 
10:32:46.816871] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:09.774 [2024-07-15 10:32:46.816923] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:09.774 [2024-07-15 10:32:46.816961] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2016d40 00:25:09.774 [2024-07-15 10:32:46.816975] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:09.774 [2024-07-15 10:32:46.818751] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:09.774 [2024-07-15 10:32:46.818781] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:09.774 BaseBdev1 00:25:09.774 10:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:09.774 10:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:10.339 BaseBdev2_malloc 00:25:10.339 10:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:10.596 [2024-07-15 10:32:47.571692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:10.596 [2024-07-15 10:32:47.571740] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:10.596 [2024-07-15 10:32:47.571767] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2017860 00:25:10.596 [2024-07-15 10:32:47.571780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:10.596 [2024-07-15 10:32:47.573333] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:10.596 [2024-07-15 10:32:47.573360] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:10.596 BaseBdev2 00:25:10.596 10:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:10.596 10:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:10.854 BaseBdev3_malloc 00:25:10.854 10:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:10.854 [2024-07-15 10:32:48.042739] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:10.854 [2024-07-15 10:32:48.042786] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:10.854 [2024-07-15 10:32:48.042811] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21c48f0 00:25:10.854 [2024-07-15 10:32:48.042824] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:10.854 [2024-07-15 10:32:48.044365] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:10.854 [2024-07-15 10:32:48.044393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:10.854 BaseBdev3 00:25:11.111 10:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:11.111 10:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:11.369 BaseBdev4_malloc 00:25:11.627 10:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:25:11.627 [2024-07-15 10:32:48.806559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:11.627 [2024-07-15 10:32:48.806607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:11.627 [2024-07-15 10:32:48.806632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21c3ad0 00:25:11.627 [2024-07-15 10:32:48.806644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:11.627 [2024-07-15 10:32:48.808168] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:11.627 [2024-07-15 10:32:48.808195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:11.627 BaseBdev4 00:25:11.885 10:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:11.885 spare_malloc 00:25:11.885 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:12.144 spare_delay 00:25:12.144 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:12.401 [2024-07-15 10:32:49.504985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:12.401 [2024-07-15 10:32:49.505033] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:12.401 [2024-07-15 10:32:49.505059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21c85b0 00:25:12.401 [2024-07-15 10:32:49.505072] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:12.401 
[2024-07-15 10:32:49.506685] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:12.401 [2024-07-15 10:32:49.506714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:12.401 spare 00:25:12.401 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:12.662 [2024-07-15 10:32:49.669436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:12.662 [2024-07-15 10:32:49.670788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:12.662 [2024-07-15 10:32:49.670844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:12.662 [2024-07-15 10:32:49.670890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:12.662 [2024-07-15 10:32:49.670985] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21478a0 00:25:12.662 [2024-07-15 10:32:49.670996] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:12.662 [2024-07-15 10:32:49.671217] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21c1e10 00:25:12.662 [2024-07-15 10:32:49.671370] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21478a0 00:25:12.662 [2024-07-15 10:32:49.671381] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21478a0 00:25:12.662 [2024-07-15 10:32:49.671496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.662 10:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.264 10:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:13.264 "name": "raid_bdev1", 00:25:13.264 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:13.264 "strip_size_kb": 0, 00:25:13.264 "state": "online", 00:25:13.264 "raid_level": "raid1", 00:25:13.264 "superblock": false, 00:25:13.264 "num_base_bdevs": 4, 00:25:13.264 "num_base_bdevs_discovered": 4, 00:25:13.264 "num_base_bdevs_operational": 4, 00:25:13.264 "base_bdevs_list": [ 00:25:13.264 { 00:25:13.264 "name": "BaseBdev1", 00:25:13.264 "uuid": "cc9e6397-7de1-549c-b66d-a43b3a0ab7a5", 00:25:13.264 "is_configured": true, 00:25:13.264 "data_offset": 0, 00:25:13.264 "data_size": 65536 00:25:13.264 }, 00:25:13.264 { 00:25:13.264 "name": "BaseBdev2", 00:25:13.264 "uuid": "3f563881-3be7-5f99-bd8e-413c6c88ad58", 
00:25:13.264 "is_configured": true, 00:25:13.264 "data_offset": 0, 00:25:13.264 "data_size": 65536 00:25:13.264 }, 00:25:13.264 { 00:25:13.264 "name": "BaseBdev3", 00:25:13.264 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:13.264 "is_configured": true, 00:25:13.264 "data_offset": 0, 00:25:13.264 "data_size": 65536 00:25:13.264 }, 00:25:13.264 { 00:25:13.264 "name": "BaseBdev4", 00:25:13.264 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:13.264 "is_configured": true, 00:25:13.264 "data_offset": 0, 00:25:13.264 "data_size": 65536 00:25:13.264 } 00:25:13.264 ] 00:25:13.264 }' 00:25:13.264 10:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:13.264 10:32:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:13.829 10:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:13.829 10:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:13.829 [2024-07-15 10:32:50.997248] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:13.829 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:13.829 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.829 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # 
local write_unit_size 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:14.394 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:14.652 [2024-07-15 10:32:51.750978] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21c1e10 00:25:14.652 /dev/nbd0 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q 
-w nbd0 /proc/partitions 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:14.652 1+0 records in 00:25:14.652 1+0 records out 00:25:14.652 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264005 s, 15.5 MB/s 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:14.652 10:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:25:22.758 65536+0 records in 00:25:22.758 65536+0 records out 00:25:22.758 33554432 bytes (34 MB, 32 MiB) copied, 7.20608 s, 4.7 MB/s 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:22.758 [2024-07-15 10:32:59.288021] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:22.758 [2024-07-15 10:32:59.520685] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:22.758 
10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:22.758 "name": "raid_bdev1", 00:25:22.758 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:22.758 "strip_size_kb": 0, 00:25:22.758 "state": "online", 00:25:22.758 "raid_level": "raid1", 00:25:22.758 "superblock": false, 00:25:22.758 "num_base_bdevs": 4, 00:25:22.758 "num_base_bdevs_discovered": 3, 00:25:22.758 "num_base_bdevs_operational": 3, 00:25:22.758 "base_bdevs_list": [ 00:25:22.758 { 00:25:22.758 "name": null, 00:25:22.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:22.758 "is_configured": 
false, 00:25:22.758 "data_offset": 0, 00:25:22.758 "data_size": 65536 00:25:22.758 }, 00:25:22.758 { 00:25:22.758 "name": "BaseBdev2", 00:25:22.758 "uuid": "3f563881-3be7-5f99-bd8e-413c6c88ad58", 00:25:22.758 "is_configured": true, 00:25:22.758 "data_offset": 0, 00:25:22.758 "data_size": 65536 00:25:22.758 }, 00:25:22.758 { 00:25:22.758 "name": "BaseBdev3", 00:25:22.758 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:22.758 "is_configured": true, 00:25:22.758 "data_offset": 0, 00:25:22.758 "data_size": 65536 00:25:22.758 }, 00:25:22.758 { 00:25:22.758 "name": "BaseBdev4", 00:25:22.758 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:22.758 "is_configured": true, 00:25:22.758 "data_offset": 0, 00:25:22.758 "data_size": 65536 00:25:22.758 } 00:25:22.758 ] 00:25:22.758 }' 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:22.758 10:32:59 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:23.325 10:33:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:23.583 [2024-07-15 10:33:00.611606] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:23.583 [2024-07-15 10:33:00.615702] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x214d6b0 00:25:23.583 [2024-07-15 10:33:00.618088] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:23.583 10:33:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:24.517 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:24.517 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:24.517 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:24.517 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:24.517 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:24.517 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.517 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.776 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:24.776 "name": "raid_bdev1", 00:25:24.776 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:24.776 "strip_size_kb": 0, 00:25:24.776 "state": "online", 00:25:24.776 "raid_level": "raid1", 00:25:24.776 "superblock": false, 00:25:24.776 "num_base_bdevs": 4, 00:25:24.776 "num_base_bdevs_discovered": 4, 00:25:24.776 "num_base_bdevs_operational": 4, 00:25:24.776 "process": { 00:25:24.776 "type": "rebuild", 00:25:24.776 "target": "spare", 00:25:24.776 "progress": { 00:25:24.776 "blocks": 22528, 00:25:24.776 "percent": 34 00:25:24.776 } 00:25:24.776 }, 00:25:24.776 "base_bdevs_list": [ 00:25:24.776 { 00:25:24.776 "name": "spare", 00:25:24.776 "uuid": "4244e6ee-c6f9-5068-9301-f8b32879da0a", 00:25:24.776 "is_configured": true, 00:25:24.776 "data_offset": 0, 00:25:24.776 "data_size": 65536 00:25:24.776 }, 00:25:24.776 { 00:25:24.776 "name": "BaseBdev2", 00:25:24.776 "uuid": "3f563881-3be7-5f99-bd8e-413c6c88ad58", 00:25:24.776 "is_configured": true, 00:25:24.776 "data_offset": 0, 00:25:24.776 "data_size": 65536 00:25:24.776 }, 00:25:24.776 { 00:25:24.776 "name": "BaseBdev3", 00:25:24.776 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:24.776 "is_configured": true, 00:25:24.776 "data_offset": 0, 00:25:24.776 "data_size": 65536 00:25:24.776 }, 00:25:24.776 { 00:25:24.776 "name": "BaseBdev4", 00:25:24.776 "uuid": 
"890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:24.776 "is_configured": true, 00:25:24.776 "data_offset": 0, 00:25:24.776 "data_size": 65536 00:25:24.776 } 00:25:24.776 ] 00:25:24.776 }' 00:25:24.776 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:24.776 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:24.776 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:24.776 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:24.776 10:33:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:25.033 [2024-07-15 10:33:02.125769] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:25.033 [2024-07-15 10:33:02.129760] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:25.033 [2024-07-15 10:33:02.129803] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:25.033 [2024-07-15 10:33:02.129820] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:25.033 [2024-07-15 10:33:02.129828] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.033 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.290 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.290 "name": "raid_bdev1", 00:25:25.290 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:25.290 "strip_size_kb": 0, 00:25:25.290 "state": "online", 00:25:25.290 "raid_level": "raid1", 00:25:25.290 "superblock": false, 00:25:25.290 "num_base_bdevs": 4, 00:25:25.290 "num_base_bdevs_discovered": 3, 00:25:25.290 "num_base_bdevs_operational": 3, 00:25:25.290 "base_bdevs_list": [ 00:25:25.290 { 00:25:25.290 "name": null, 00:25:25.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.290 "is_configured": false, 00:25:25.290 "data_offset": 0, 00:25:25.290 "data_size": 65536 00:25:25.290 }, 00:25:25.290 { 00:25:25.290 "name": "BaseBdev2", 00:25:25.290 "uuid": "3f563881-3be7-5f99-bd8e-413c6c88ad58", 00:25:25.290 "is_configured": true, 00:25:25.290 "data_offset": 0, 00:25:25.290 "data_size": 65536 00:25:25.290 }, 00:25:25.290 { 00:25:25.290 "name": "BaseBdev3", 00:25:25.290 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:25.290 "is_configured": true, 00:25:25.290 "data_offset": 0, 00:25:25.290 "data_size": 65536 
00:25:25.290 }, 00:25:25.290 { 00:25:25.290 "name": "BaseBdev4", 00:25:25.290 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:25.290 "is_configured": true, 00:25:25.290 "data_offset": 0, 00:25:25.290 "data_size": 65536 00:25:25.290 } 00:25:25.290 ] 00:25:25.290 }' 00:25:25.290 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.290 10:33:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:25.853 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:25.853 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.853 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:25.853 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:25.853 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.853 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.853 10:33:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.110 10:33:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.110 "name": "raid_bdev1", 00:25:26.110 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:26.110 "strip_size_kb": 0, 00:25:26.110 "state": "online", 00:25:26.110 "raid_level": "raid1", 00:25:26.110 "superblock": false, 00:25:26.110 "num_base_bdevs": 4, 00:25:26.110 "num_base_bdevs_discovered": 3, 00:25:26.110 "num_base_bdevs_operational": 3, 00:25:26.110 "base_bdevs_list": [ 00:25:26.110 { 00:25:26.110 "name": null, 00:25:26.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.110 "is_configured": false, 00:25:26.110 "data_offset": 0, 00:25:26.110 
"data_size": 65536 00:25:26.110 }, 00:25:26.110 { 00:25:26.111 "name": "BaseBdev2", 00:25:26.111 "uuid": "3f563881-3be7-5f99-bd8e-413c6c88ad58", 00:25:26.111 "is_configured": true, 00:25:26.111 "data_offset": 0, 00:25:26.111 "data_size": 65536 00:25:26.111 }, 00:25:26.111 { 00:25:26.111 "name": "BaseBdev3", 00:25:26.111 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:26.111 "is_configured": true, 00:25:26.111 "data_offset": 0, 00:25:26.111 "data_size": 65536 00:25:26.111 }, 00:25:26.111 { 00:25:26.111 "name": "BaseBdev4", 00:25:26.111 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:26.111 "is_configured": true, 00:25:26.111 "data_offset": 0, 00:25:26.111 "data_size": 65536 00:25:26.111 } 00:25:26.111 ] 00:25:26.111 }' 00:25:26.111 10:33:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.111 10:33:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:26.111 10:33:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.368 10:33:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:26.368 10:33:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:26.368 [2024-07-15 10:33:03.534201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:26.368 [2024-07-15 10:33:03.538280] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x214d6b0 00:25:26.368 [2024-07-15 10:33:03.539780] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:26.368 10:33:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:27.738 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:27.738 
10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.738 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:27.738 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:27.738 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.739 "name": "raid_bdev1", 00:25:27.739 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:27.739 "strip_size_kb": 0, 00:25:27.739 "state": "online", 00:25:27.739 "raid_level": "raid1", 00:25:27.739 "superblock": false, 00:25:27.739 "num_base_bdevs": 4, 00:25:27.739 "num_base_bdevs_discovered": 4, 00:25:27.739 "num_base_bdevs_operational": 4, 00:25:27.739 "process": { 00:25:27.739 "type": "rebuild", 00:25:27.739 "target": "spare", 00:25:27.739 "progress": { 00:25:27.739 "blocks": 24576, 00:25:27.739 "percent": 37 00:25:27.739 } 00:25:27.739 }, 00:25:27.739 "base_bdevs_list": [ 00:25:27.739 { 00:25:27.739 "name": "spare", 00:25:27.739 "uuid": "4244e6ee-c6f9-5068-9301-f8b32879da0a", 00:25:27.739 "is_configured": true, 00:25:27.739 "data_offset": 0, 00:25:27.739 "data_size": 65536 00:25:27.739 }, 00:25:27.739 { 00:25:27.739 "name": "BaseBdev2", 00:25:27.739 "uuid": "3f563881-3be7-5f99-bd8e-413c6c88ad58", 00:25:27.739 "is_configured": true, 00:25:27.739 "data_offset": 0, 00:25:27.739 "data_size": 65536 00:25:27.739 }, 00:25:27.739 { 00:25:27.739 "name": "BaseBdev3", 00:25:27.739 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:27.739 
"is_configured": true, 00:25:27.739 "data_offset": 0, 00:25:27.739 "data_size": 65536 00:25:27.739 }, 00:25:27.739 { 00:25:27.739 "name": "BaseBdev4", 00:25:27.739 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:27.739 "is_configured": true, 00:25:27.739 "data_offset": 0, 00:25:27.739 "data_size": 65536 00:25:27.739 } 00:25:27.739 ] 00:25:27.739 }' 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:27.739 10:33:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:27.996 [2024-07-15 10:33:05.127934] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:27.996 [2024-07-15 10:33:05.152481] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x214d6b0 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.996 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.254 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.254 "name": "raid_bdev1", 00:25:28.254 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:28.254 "strip_size_kb": 0, 00:25:28.254 "state": "online", 00:25:28.254 "raid_level": "raid1", 00:25:28.254 "superblock": false, 00:25:28.254 "num_base_bdevs": 4, 00:25:28.254 "num_base_bdevs_discovered": 3, 00:25:28.254 "num_base_bdevs_operational": 3, 00:25:28.254 "process": { 00:25:28.254 "type": "rebuild", 00:25:28.254 "target": "spare", 00:25:28.254 "progress": { 00:25:28.254 "blocks": 36864, 00:25:28.254 "percent": 56 00:25:28.254 } 00:25:28.254 }, 00:25:28.254 "base_bdevs_list": [ 00:25:28.254 { 00:25:28.254 "name": "spare", 00:25:28.254 "uuid": "4244e6ee-c6f9-5068-9301-f8b32879da0a", 00:25:28.254 "is_configured": true, 00:25:28.254 "data_offset": 0, 00:25:28.254 "data_size": 65536 00:25:28.254 }, 00:25:28.254 { 00:25:28.254 "name": null, 00:25:28.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.254 "is_configured": false, 00:25:28.254 "data_offset": 0, 00:25:28.254 "data_size": 65536 00:25:28.254 }, 00:25:28.254 { 00:25:28.254 "name": "BaseBdev3", 00:25:28.254 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:28.254 
"is_configured": true, 00:25:28.254 "data_offset": 0, 00:25:28.254 "data_size": 65536 00:25:28.254 }, 00:25:28.254 { 00:25:28.254 "name": "BaseBdev4", 00:25:28.254 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:28.254 "is_configured": true, 00:25:28.254 "data_offset": 0, 00:25:28.254 "data_size": 65536 00:25:28.254 } 00:25:28.254 ] 00:25:28.254 }' 00:25:28.254 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:28.254 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:28.254 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=870 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:28.512 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.771 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:28.771 "name": 
"raid_bdev1", 00:25:28.771 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:28.771 "strip_size_kb": 0, 00:25:28.771 "state": "online", 00:25:28.771 "raid_level": "raid1", 00:25:28.771 "superblock": false, 00:25:28.771 "num_base_bdevs": 4, 00:25:28.771 "num_base_bdevs_discovered": 3, 00:25:28.771 "num_base_bdevs_operational": 3, 00:25:28.771 "process": { 00:25:28.771 "type": "rebuild", 00:25:28.771 "target": "spare", 00:25:28.771 "progress": { 00:25:28.771 "blocks": 43008, 00:25:28.771 "percent": 65 00:25:28.771 } 00:25:28.771 }, 00:25:28.771 "base_bdevs_list": [ 00:25:28.771 { 00:25:28.771 "name": "spare", 00:25:28.771 "uuid": "4244e6ee-c6f9-5068-9301-f8b32879da0a", 00:25:28.771 "is_configured": true, 00:25:28.771 "data_offset": 0, 00:25:28.771 "data_size": 65536 00:25:28.771 }, 00:25:28.771 { 00:25:28.771 "name": null, 00:25:28.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:28.771 "is_configured": false, 00:25:28.771 "data_offset": 0, 00:25:28.771 "data_size": 65536 00:25:28.771 }, 00:25:28.771 { 00:25:28.771 "name": "BaseBdev3", 00:25:28.771 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:28.771 "is_configured": true, 00:25:28.771 "data_offset": 0, 00:25:28.771 "data_size": 65536 00:25:28.771 }, 00:25:28.771 { 00:25:28.771 "name": "BaseBdev4", 00:25:28.771 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:28.771 "is_configured": true, 00:25:28.771 "data_offset": 0, 00:25:28.771 "data_size": 65536 00:25:28.771 } 00:25:28.771 ] 00:25:28.771 }' 00:25:28.771 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:28.771 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:28.771 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:28.771 10:33:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:28.771 10:33:05 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:25:29.705 [2024-07-15 10:33:06.765432] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:29.705 [2024-07-15 10:33:06.765497] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:29.705 [2024-07-15 10:33:06.765536] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.705 10:33:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.964 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.964 "name": "raid_bdev1", 00:25:29.964 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:29.964 "strip_size_kb": 0, 00:25:29.964 "state": "online", 00:25:29.964 "raid_level": "raid1", 00:25:29.964 "superblock": false, 00:25:29.964 "num_base_bdevs": 4, 00:25:29.964 "num_base_bdevs_discovered": 3, 00:25:29.964 "num_base_bdevs_operational": 3, 00:25:29.965 "base_bdevs_list": [ 00:25:29.965 { 00:25:29.965 "name": "spare", 00:25:29.965 "uuid": "4244e6ee-c6f9-5068-9301-f8b32879da0a", 00:25:29.965 
"is_configured": true, 00:25:29.965 "data_offset": 0, 00:25:29.965 "data_size": 65536 00:25:29.965 }, 00:25:29.965 { 00:25:29.965 "name": null, 00:25:29.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.965 "is_configured": false, 00:25:29.965 "data_offset": 0, 00:25:29.965 "data_size": 65536 00:25:29.965 }, 00:25:29.965 { 00:25:29.965 "name": "BaseBdev3", 00:25:29.965 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:29.965 "is_configured": true, 00:25:29.965 "data_offset": 0, 00:25:29.965 "data_size": 65536 00:25:29.965 }, 00:25:29.965 { 00:25:29.965 "name": "BaseBdev4", 00:25:29.965 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:29.965 "is_configured": true, 00:25:29.965 "data_offset": 0, 00:25:29.965 "data_size": 65536 00:25:29.965 } 00:25:29.965 ] 00:25:29.965 }' 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.965 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.223 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.223 "name": "raid_bdev1", 00:25:30.223 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:30.223 "strip_size_kb": 0, 00:25:30.223 "state": "online", 00:25:30.223 "raid_level": "raid1", 00:25:30.223 "superblock": false, 00:25:30.223 "num_base_bdevs": 4, 00:25:30.223 "num_base_bdevs_discovered": 3, 00:25:30.223 "num_base_bdevs_operational": 3, 00:25:30.223 "base_bdevs_list": [ 00:25:30.223 { 00:25:30.223 "name": "spare", 00:25:30.223 "uuid": "4244e6ee-c6f9-5068-9301-f8b32879da0a", 00:25:30.223 "is_configured": true, 00:25:30.223 "data_offset": 0, 00:25:30.223 "data_size": 65536 00:25:30.223 }, 00:25:30.223 { 00:25:30.223 "name": null, 00:25:30.223 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.223 "is_configured": false, 00:25:30.223 "data_offset": 0, 00:25:30.223 "data_size": 65536 00:25:30.223 }, 00:25:30.223 { 00:25:30.223 "name": "BaseBdev3", 00:25:30.223 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:30.223 "is_configured": true, 00:25:30.223 "data_offset": 0, 00:25:30.223 "data_size": 65536 00:25:30.223 }, 00:25:30.223 { 00:25:30.223 "name": "BaseBdev4", 00:25:30.223 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:30.223 "is_configured": true, 00:25:30.223 "data_offset": 0, 00:25:30.223 "data_size": 65536 00:25:30.223 } 00:25:30.223 ] 00:25:30.223 }' 00:25:30.223 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:30.223 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:30.223 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:30.482 10:33:07 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.482 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.740 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.740 "name": "raid_bdev1", 00:25:30.740 "uuid": "6b33ffa4-aaa7-4613-a946-327a4a2691be", 00:25:30.740 "strip_size_kb": 0, 00:25:30.740 "state": "online", 00:25:30.740 "raid_level": "raid1", 00:25:30.740 "superblock": false, 00:25:30.740 "num_base_bdevs": 4, 00:25:30.740 "num_base_bdevs_discovered": 3, 00:25:30.740 "num_base_bdevs_operational": 3, 00:25:30.740 "base_bdevs_list": [ 00:25:30.740 { 00:25:30.740 "name": 
"spare", 00:25:30.740 "uuid": "4244e6ee-c6f9-5068-9301-f8b32879da0a", 00:25:30.740 "is_configured": true, 00:25:30.740 "data_offset": 0, 00:25:30.740 "data_size": 65536 00:25:30.740 }, 00:25:30.740 { 00:25:30.740 "name": null, 00:25:30.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.740 "is_configured": false, 00:25:30.740 "data_offset": 0, 00:25:30.740 "data_size": 65536 00:25:30.740 }, 00:25:30.740 { 00:25:30.740 "name": "BaseBdev3", 00:25:30.740 "uuid": "ddf13157-5e4f-5d67-8542-aec4d1f8ca96", 00:25:30.740 "is_configured": true, 00:25:30.740 "data_offset": 0, 00:25:30.740 "data_size": 65536 00:25:30.740 }, 00:25:30.740 { 00:25:30.740 "name": "BaseBdev4", 00:25:30.740 "uuid": "890363b1-c838-5fbd-a11f-f3bf2e231aaf", 00:25:30.740 "is_configured": true, 00:25:30.740 "data_offset": 0, 00:25:30.740 "data_size": 65536 00:25:30.740 } 00:25:30.740 ] 00:25:30.740 }' 00:25:30.740 10:33:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.740 10:33:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:31.305 10:33:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:31.563 [2024-07-15 10:33:08.550630] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:31.563 [2024-07-15 10:33:08.550657] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:31.563 [2024-07-15 10:33:08.550717] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:31.563 [2024-07-15 10:33:08.550787] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:31.563 [2024-07-15 10:33:08.550798] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21478a0 name raid_bdev1, state offline 00:25:31.563 10:33:08 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.563 10:33:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:31.821 10:33:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:32.079 /dev/nbd0 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:32.079 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:32.079 1+0 records in 00:25:32.079 1+0 records out 00:25:32.080 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245025 s, 16.7 MB/s 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:32.080 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:32.338 /dev/nbd1 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:32.338 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:32.338 1+0 records in 00:25:32.338 1+0 records out 00:25:32.338 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306921 s, 13.3 MB/s 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:32.339 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:32.597 10:33:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 595546 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 595546 ']' 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 595546 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:32.876 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 595546 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 595546' 00:25:33.171 killing process with pid 595546 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 595546 00:25:33.171 Received shutdown signal, test time was about 60.000000 seconds 00:25:33.171 00:25:33.171 Latency(us) 00:25:33.171 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:33.171 =================================================================================================================== 00:25:33.171 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:33.171 [2024-07-15 10:33:10.074991] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 595546 00:25:33.171 [2024-07-15 10:33:10.123275] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:25:33.171 00:25:33.171 real 0m24.966s 00:25:33.171 user 0m34.430s 00:25:33.171 sys 0m5.144s 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:33.171 10:33:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:25:33.171 ************************************ 00:25:33.171 END TEST raid_rebuild_test 00:25:33.171 ************************************ 00:25:33.429 10:33:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:33.429 10:33:10 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:25:33.429 10:33:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:33.429 10:33:10 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:25:33.429 10:33:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:33.429 ************************************ 00:25:33.429 START TEST raid_rebuild_test_sb 00:25:33.429 ************************************ 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=599566 00:25:33.429 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 599566 /var/tmp/spdk-raid.sock 00:25:33.430 10:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 
00:25:33.430 10:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 599566 ']' 00:25:33.430 10:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:33.430 10:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:33.430 10:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:33.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:33.430 10:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:33.430 10:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:33.430 [2024-07-15 10:33:10.497392] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:25:33.430 [2024-07-15 10:33:10.497457] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid599566 ] 00:25:33.430 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:33.430 Zero copy mechanism will not be used. 
00:25:33.430 [2024-07-15 10:33:10.616269] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:33.687 [2024-07-15 10:33:10.723241] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:33.687 [2024-07-15 10:33:10.790067] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:33.687 [2024-07-15 10:33:10.790107] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:34.253 10:33:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:34.253 10:33:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:25:34.253 10:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:34.253 10:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:34.511 BaseBdev1_malloc 00:25:34.511 10:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:34.769 [2024-07-15 10:33:11.891630] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:34.769 [2024-07-15 10:33:11.891676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:34.769 [2024-07-15 10:33:11.891700] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a90d40 00:25:34.769 [2024-07-15 10:33:11.891714] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:34.769 [2024-07-15 10:33:11.893441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:34.769 [2024-07-15 10:33:11.893474] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:34.769 BaseBdev1 
00:25:34.769 10:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:25:34.769 10:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:25:35.027 BaseBdev2_malloc
00:25:35.027 10:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
00:25:35.285 [2024-07-15 10:33:12.377887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc
00:25:35.285 [2024-07-15 10:33:12.377939] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:25:35.285 [2024-07-15 10:33:12.377963] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a91860
00:25:35.285 [2024-07-15 10:33:12.377976] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:25:35.285 [2024-07-15 10:33:12.379526] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:25:35.285 [2024-07-15 10:33:12.379554] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:25:35.285 BaseBdev2
00:25:35.285 10:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:25:35.285 10:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:25:35.543 BaseBdev3_malloc
00:25:35.543 10:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3
00:25:35.801 [2024-07-15 10:33:12.861110] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc
00:25:35.801 [2024-07-15 10:33:12.861157] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:25:35.801 [2024-07-15 10:33:12.861179] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3e8f0
00:25:35.801 [2024-07-15 10:33:12.861192] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:25:35.801 [2024-07-15 10:33:12.862766] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:25:35.801 [2024-07-15 10:33:12.862793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:25:35.801 BaseBdev3
00:25:35.801 10:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:25:35.801 10:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:25:36.060 BaseBdev4_malloc
00:25:36.060 10:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4
00:25:36.317 [2024-07-15 10:33:13.348109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc
00:25:36.317 [2024-07-15 10:33:13.348156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:25:36.317 [2024-07-15 10:33:13.348177] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3dad0
00:25:36.317 [2024-07-15 10:33:13.348190] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:25:36.317 [2024-07-15 10:33:13.349754] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:25:36.317 [2024-07-15 10:33:13.349780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:25:36.317 BaseBdev4
00:25:36.317 10:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
00:25:36.574 spare_malloc
00:25:36.574 10:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
00:25:36.831 spare_delay
00:25:36.831 10:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:25:37.088 [2024-07-15 10:33:14.066588] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:25:37.088 [2024-07-15 10:33:14.066633] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:25:37.088 [2024-07-15 10:33:14.066653] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c425b0
00:25:37.088 [2024-07-15 10:33:14.066666] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:25:37.088 [2024-07-15 10:33:14.068219] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:25:37.088 [2024-07-15 10:33:14.068247] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:25:37.088 spare
00:25:37.088 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1
00:25:37.346 [2024-07-15 10:33:14.303248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:25:37.346 [2024-07-15 10:33:14.304561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:25:37.346 [2024-07-15 10:33:14.304617] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:25:37.346 [2024-07-15 10:33:14.304662] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:25:37.346 [2024-07-15 10:33:14.304858] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bc18a0
00:25:37.346 [2024-07-15 10:33:14.304869] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:25:37.346 [2024-07-15 10:33:14.305080] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3be10
00:25:37.346 [2024-07-15 10:33:14.305234] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bc18a0
00:25:37.346 [2024-07-15 10:33:14.305244] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bc18a0
00:25:37.346 [2024-07-15 10:33:14.305344] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:37.346 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:37.912 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:25:37.912 "name": "raid_bdev1",
00:25:37.912 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96",
00:25:37.912 "strip_size_kb": 0,
00:25:37.912 "state": "online",
00:25:37.912 "raid_level": "raid1",
00:25:37.912 "superblock": true,
00:25:37.912 "num_base_bdevs": 4,
00:25:37.912 "num_base_bdevs_discovered": 4,
00:25:37.912 "num_base_bdevs_operational": 4,
00:25:37.912 "base_bdevs_list": [
00:25:37.912 {
00:25:37.912 "name": "BaseBdev1",
00:25:37.912 "uuid": "b69ec91e-f44d-579f-afcb-01f9f88849fa",
00:25:37.912 "is_configured": true,
00:25:37.912 "data_offset": 2048,
00:25:37.912 "data_size": 63488
00:25:37.912 },
00:25:37.912 {
00:25:37.912 "name": "BaseBdev2",
00:25:37.912 "uuid": "931ede2e-0e95-5028-860f-546dc314dcf9",
00:25:37.912 "is_configured": true,
00:25:37.912 "data_offset": 2048,
00:25:37.912 "data_size": 63488
00:25:37.912 },
00:25:37.912 {
00:25:37.912 "name": "BaseBdev3",
00:25:37.912 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960",
00:25:37.912 "is_configured": true,
00:25:37.912 "data_offset": 2048,
00:25:37.912 "data_size": 63488
00:25:37.912 },
00:25:37.912 {
00:25:37.912 "name": "BaseBdev4",
00:25:37.912 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec",
00:25:37.912 "is_configured": true,
00:25:37.912 "data_offset": 2048,
00:25:37.912 "data_size": 63488
00:25:37.912 }
00:25:37.912 ]
00:25:37.912 }'
00:25:37.912 10:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:25:37.912 10:33:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:25:38.477 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:25:38.477 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks'
00:25:38.477 [2024-07-15 10:33:15.667159] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:25:38.734 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488
00:25:38.734 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:38.734 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']'
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']'
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1')
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:25:38.992 10:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
00:25:38.992 [2024-07-15 10:33:16.168222] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3be10
00:25:38.992 /dev/nbd0
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:25:39.250 1+0 records in
00:25:39.250 1+0 records out
00:25:39.250 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268296 s, 15.3 MB/s
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']'
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1
00:25:39.250 10:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
00:25:47.354 63488+0 records in
00:25:47.354 63488+0 records out
00:25:47.354 32505856 bytes (33 MB, 31 MiB) copied, 6.96671 s, 4.7 MB/s
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:25:47.354 [2024-07-15 10:33:23.481699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:25:47.354 [2024-07-15 10:33:23.642170] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:47.354 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:25:47.354 "name": "raid_bdev1",
00:25:47.354 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96",
00:25:47.354 "strip_size_kb": 0,
00:25:47.354 "state": "online",
00:25:47.354 "raid_level": "raid1",
00:25:47.354 "superblock": true,
00:25:47.354 "num_base_bdevs": 4,
00:25:47.354 "num_base_bdevs_discovered": 3,
00:25:47.355 "num_base_bdevs_operational": 3,
00:25:47.355 "base_bdevs_list": [
00:25:47.355 {
00:25:47.355 "name": null,
00:25:47.355 "uuid": "00000000-0000-0000-0000-000000000000",
00:25:47.355 "is_configured": false,
00:25:47.355 "data_offset": 2048,
00:25:47.355 "data_size": 63488
00:25:47.355 },
00:25:47.355 {
00:25:47.355 "name": "BaseBdev2",
00:25:47.355 "uuid": "931ede2e-0e95-5028-860f-546dc314dcf9",
00:25:47.355 "is_configured": true,
00:25:47.355 "data_offset": 2048,
00:25:47.355 "data_size": 63488
00:25:47.355 },
00:25:47.355 {
00:25:47.355 "name": "BaseBdev3",
00:25:47.355 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960",
00:25:47.355 "is_configured": true,
00:25:47.355 "data_offset": 2048,
00:25:47.355 "data_size": 63488
00:25:47.355 },
00:25:47.355 {
00:25:47.355 "name": "BaseBdev4",
00:25:47.355 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec",
00:25:47.355 "is_configured": true,
00:25:47.355 "data_offset": 2048,
00:25:47.355 "data_size": 63488
00:25:47.355 }
00:25:47.355 ]
00:25:47.355 }'
00:25:47.355 10:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:25:47.355 10:33:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:25:47.355 10:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:25:47.613 [2024-07-15 10:33:24.725052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:25:47.613 [2024-07-15 10:33:24.729097] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3be10
00:25:47.613 [2024-07-15 10:33:24.731453] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:25:47.613 10:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:25:48.985 "name": "raid_bdev1",
00:25:48.985 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96",
00:25:48.985 "strip_size_kb": 0,
00:25:48.985 "state": "online",
00:25:48.985 "raid_level": "raid1",
00:25:48.985 "superblock": true,
00:25:48.985 "num_base_bdevs": 4,
00:25:48.985 "num_base_bdevs_discovered": 4,
00:25:48.985 "num_base_bdevs_operational": 4,
00:25:48.985 "process": {
00:25:48.985 "type": "rebuild",
00:25:48.985 "target": "spare",
00:25:48.985 "progress": {
00:25:48.985 "blocks": 22528,
00:25:48.985 "percent": 35
00:25:48.985 }
00:25:48.985 },
00:25:48.985 "base_bdevs_list": [
00:25:48.985 {
00:25:48.985 "name": "spare",
00:25:48.985 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6",
00:25:48.985 "is_configured": true,
00:25:48.985 "data_offset": 2048,
00:25:48.985 "data_size": 63488
00:25:48.985 },
00:25:48.985 {
00:25:48.985 "name": "BaseBdev2",
00:25:48.985 "uuid": "931ede2e-0e95-5028-860f-546dc314dcf9",
00:25:48.985 "is_configured": true,
00:25:48.985 "data_offset": 2048,
00:25:48.985 "data_size": 63488
00:25:48.985 },
00:25:48.985 {
00:25:48.985 "name": "BaseBdev3",
00:25:48.985 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960",
00:25:48.985 "is_configured": true,
00:25:48.985 "data_offset": 2048,
00:25:48.985 "data_size": 63488
00:25:48.985 },
00:25:48.985 {
00:25:48.985 "name": "BaseBdev4",
00:25:48.985 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec",
00:25:48.985 "is_configured": true,
00:25:48.985 "data_offset": 2048,
00:25:48.985 "data_size": 63488
00:25:48.985 }
00:25:48.985 ]
00:25:48.985 }'
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:25:48.985 10:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:25:48.985 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:25:48.985 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:25:49.242 [2024-07-15 10:33:26.247007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:25:49.242 [2024-07-15 10:33:26.343779] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:25:49.242 [2024-07-15 10:33:26.343826] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:25:49.242 [2024-07-15 10:33:26.343843] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:25:49.242 [2024-07-15 10:33:26.343852] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:25:49.242 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:25:49.242 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:25:49.242 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:25:49.242 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:25:49.242 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:25:49.242 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:25:49.242 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:25:49.243 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:25:49.243 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:25:49.243 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:25:49.243 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:49.243 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:49.500 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:25:49.500 "name": "raid_bdev1",
00:25:49.500 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96",
00:25:49.500 "strip_size_kb": 0,
00:25:49.500 "state": "online",
00:25:49.500 "raid_level": "raid1",
00:25:49.500 "superblock": true,
00:25:49.500 "num_base_bdevs": 4,
00:25:49.500 "num_base_bdevs_discovered": 3,
00:25:49.500 "num_base_bdevs_operational": 3,
00:25:49.500 "base_bdevs_list": [
00:25:49.500 {
00:25:49.500 "name": null,
00:25:49.500 "uuid": "00000000-0000-0000-0000-000000000000",
00:25:49.500 "is_configured": false,
00:25:49.500 "data_offset": 2048,
00:25:49.500 "data_size": 63488
00:25:49.500 },
00:25:49.500 {
00:25:49.500 "name": "BaseBdev2",
00:25:49.500 "uuid": "931ede2e-0e95-5028-860f-546dc314dcf9",
00:25:49.500 "is_configured": true,
00:25:49.500 "data_offset": 2048,
00:25:49.500 "data_size": 63488
00:25:49.500 },
00:25:49.500 {
00:25:49.500 "name": "BaseBdev3",
00:25:49.500 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960",
00:25:49.500 "is_configured": true,
00:25:49.500 "data_offset": 2048,
00:25:49.500 "data_size": 63488
00:25:49.500 },
00:25:49.500 {
00:25:49.500 "name": "BaseBdev4",
00:25:49.500 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec",
00:25:49.500 "is_configured": true,
00:25:49.500 "data_offset": 2048,
00:25:49.500 "data_size": 63488
00:25:49.500 }
00:25:49.500 ]
00:25:49.500 }'
00:25:49.500 10:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:25:49.500 10:33:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:25:50.065 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:25:50.065 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:25:50.065 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:25:50.065 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none
00:25:50.065 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:25:50.065 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:50.065 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:50.323 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:25:50.323 "name": "raid_bdev1",
00:25:50.323 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96",
00:25:50.323 "strip_size_kb": 0,
00:25:50.323 "state": "online",
00:25:50.323 "raid_level": "raid1",
00:25:50.323 "superblock": true,
00:25:50.323 "num_base_bdevs": 4,
00:25:50.323 "num_base_bdevs_discovered": 3,
00:25:50.323 "num_base_bdevs_operational": 3,
00:25:50.323 "base_bdevs_list": [
00:25:50.323 {
00:25:50.323 "name": null,
00:25:50.323 "uuid": "00000000-0000-0000-0000-000000000000",
00:25:50.323 "is_configured": false,
00:25:50.323 "data_offset": 2048,
00:25:50.323 "data_size": 63488
00:25:50.323 },
00:25:50.323 {
00:25:50.323 "name": "BaseBdev2",
00:25:50.323 "uuid": "931ede2e-0e95-5028-860f-546dc314dcf9",
00:25:50.323 "is_configured": true,
00:25:50.323 "data_offset": 2048,
00:25:50.323 "data_size": 63488
00:25:50.323 },
00:25:50.323 {
00:25:50.323 "name": "BaseBdev3",
00:25:50.323 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960",
00:25:50.323 "is_configured": true,
00:25:50.323 "data_offset": 2048,
00:25:50.323 "data_size": 63488
00:25:50.323 },
00:25:50.323 {
00:25:50.323 "name": "BaseBdev4",
00:25:50.323 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec",
00:25:50.323 "is_configured": true,
00:25:50.323 "data_offset": 2048,
00:25:50.323 "data_size": 63488
00:25:50.323 }
00:25:50.323 ]
00:25:50.323 }'
00:25:50.324 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:25:50.324 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:25:50.324 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:25:50.582 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:25:50.582 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:25:50.582 [2024-07-15 10:33:27.708029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:25:50.582 [2024-07-15 10:33:27.712104] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1797fa0
00:25:50.582 [2024-07-15 10:33:27.713599] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:25:50.582 10:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:25:51.955 "name": "raid_bdev1",
00:25:51.955 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96",
00:25:51.955 "strip_size_kb": 0,
00:25:51.955 "state": "online",
00:25:51.955 "raid_level": "raid1",
00:25:51.955 "superblock": true,
00:25:51.955 "num_base_bdevs": 4,
00:25:51.955 "num_base_bdevs_discovered": 4,
00:25:51.955 "num_base_bdevs_operational": 4,
00:25:51.955 "process": {
00:25:51.955 "type": "rebuild",
00:25:51.955 "target": "spare",
00:25:51.955 "progress": {
00:25:51.955 "blocks": 22528,
00:25:51.955 "percent": 35
00:25:51.955 }
00:25:51.955 },
00:25:51.955 "base_bdevs_list": [
00:25:51.955 {
00:25:51.955 "name": "spare",
00:25:51.955 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6",
00:25:51.955 "is_configured": true,
00:25:51.955 "data_offset": 2048,
00:25:51.955 "data_size": 63488
00:25:51.955 },
00:25:51.955 {
00:25:51.955 "name": "BaseBdev2",
00:25:51.955 "uuid": "931ede2e-0e95-5028-860f-546dc314dcf9",
00:25:51.955 "is_configured": true,
00:25:51.955 "data_offset": 2048,
00:25:51.955 "data_size": 63488
00:25:51.955 },
00:25:51.955 {
00:25:51.955 "name": "BaseBdev3",
00:25:51.955 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960",
00:25:51.955 "is_configured": true,
00:25:51.955 "data_offset": 2048,
00:25:51.955 "data_size": 63488
00:25:51.955 },
00:25:51.955 {
00:25:51.955 "name": "BaseBdev4",
00:25:51.955 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec",
00:25:51.955 "is_configured": true,
00:25:51.955 "data_offset": 2048,
00:25:51.955 "data_size": 63488
00:25:51.955 }
00:25:51.955 ]
00:25:51.955 }'
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:25:51.955 10:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:25:51.955 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:25:51.955 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']'
00:25:51.955 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']'
00:25:51.955 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected
00:25:51.955 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4
00:25:51.955 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:25:51.955 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']'
00:25:51.955 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:25:52.213 [2024-07-15 10:33:29.241147] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:25:52.471 [2024-07-15 10:33:29.426664] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1797fa0
00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]=
00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- ))
00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.471 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.738 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:52.738 "name": "raid_bdev1", 00:25:52.738 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:52.738 "strip_size_kb": 0, 00:25:52.738 "state": "online", 00:25:52.738 "raid_level": "raid1", 00:25:52.738 "superblock": true, 00:25:52.738 "num_base_bdevs": 4, 00:25:52.738 "num_base_bdevs_discovered": 3, 00:25:52.738 "num_base_bdevs_operational": 3, 00:25:52.738 "process": { 00:25:52.738 "type": "rebuild", 00:25:52.738 "target": "spare", 00:25:52.738 "progress": { 00:25:52.738 "blocks": 36864, 00:25:52.738 "percent": 58 00:25:52.738 } 00:25:52.738 }, 00:25:52.738 "base_bdevs_list": [ 00:25:52.738 { 00:25:52.738 "name": "spare", 00:25:52.738 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:25:52.738 "is_configured": true, 00:25:52.738 "data_offset": 2048, 00:25:52.738 "data_size": 63488 00:25:52.738 }, 00:25:52.738 { 00:25:52.738 "name": null, 00:25:52.738 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.738 "is_configured": false, 00:25:52.738 "data_offset": 2048, 00:25:52.738 
"data_size": 63488 00:25:52.738 }, 00:25:52.738 { 00:25:52.738 "name": "BaseBdev3", 00:25:52.738 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:52.738 "is_configured": true, 00:25:52.739 "data_offset": 2048, 00:25:52.739 "data_size": 63488 00:25:52.739 }, 00:25:52.739 { 00:25:52.739 "name": "BaseBdev4", 00:25:52.739 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:52.739 "is_configured": true, 00:25:52.739 "data_offset": 2048, 00:25:52.739 "data_size": 63488 00:25:52.739 } 00:25:52.739 ] 00:25:52.739 }' 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=894 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.739 10:33:29 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.017 10:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:53.017 "name": "raid_bdev1", 00:25:53.017 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:53.017 "strip_size_kb": 0, 00:25:53.017 "state": "online", 00:25:53.017 "raid_level": "raid1", 00:25:53.017 "superblock": true, 00:25:53.017 "num_base_bdevs": 4, 00:25:53.017 "num_base_bdevs_discovered": 3, 00:25:53.017 "num_base_bdevs_operational": 3, 00:25:53.017 "process": { 00:25:53.017 "type": "rebuild", 00:25:53.017 "target": "spare", 00:25:53.017 "progress": { 00:25:53.017 "blocks": 43008, 00:25:53.017 "percent": 67 00:25:53.017 } 00:25:53.017 }, 00:25:53.017 "base_bdevs_list": [ 00:25:53.017 { 00:25:53.017 "name": "spare", 00:25:53.017 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:25:53.017 "is_configured": true, 00:25:53.017 "data_offset": 2048, 00:25:53.017 "data_size": 63488 00:25:53.017 }, 00:25:53.017 { 00:25:53.017 "name": null, 00:25:53.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:53.017 "is_configured": false, 00:25:53.017 "data_offset": 2048, 00:25:53.017 "data_size": 63488 00:25:53.017 }, 00:25:53.017 { 00:25:53.017 "name": "BaseBdev3", 00:25:53.017 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:53.017 "is_configured": true, 00:25:53.017 "data_offset": 2048, 00:25:53.017 "data_size": 63488 00:25:53.017 }, 00:25:53.017 { 00:25:53.017 "name": "BaseBdev4", 00:25:53.017 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:53.017 "is_configured": true, 00:25:53.017 "data_offset": 2048, 00:25:53.017 "data_size": 63488 00:25:53.017 } 00:25:53.017 ] 00:25:53.017 }' 00:25:53.017 10:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:53.017 10:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:53.017 10:33:30 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:53.017 10:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:53.017 10:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:53.950 [2024-07-15 10:33:30.938980] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:53.950 [2024-07-15 10:33:30.939045] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:53.950 [2024-07-15 10:33:30.939143] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.208 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.208 "name": "raid_bdev1", 00:25:54.208 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:54.208 "strip_size_kb": 0, 00:25:54.208 "state": "online", 00:25:54.208 "raid_level": "raid1", 00:25:54.208 "superblock": true, 00:25:54.208 "num_base_bdevs": 
4, 00:25:54.208 "num_base_bdevs_discovered": 3, 00:25:54.208 "num_base_bdevs_operational": 3, 00:25:54.208 "base_bdevs_list": [ 00:25:54.208 { 00:25:54.208 "name": "spare", 00:25:54.208 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:25:54.208 "is_configured": true, 00:25:54.208 "data_offset": 2048, 00:25:54.208 "data_size": 63488 00:25:54.208 }, 00:25:54.208 { 00:25:54.208 "name": null, 00:25:54.209 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.209 "is_configured": false, 00:25:54.209 "data_offset": 2048, 00:25:54.209 "data_size": 63488 00:25:54.209 }, 00:25:54.209 { 00:25:54.209 "name": "BaseBdev3", 00:25:54.209 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:54.209 "is_configured": true, 00:25:54.209 "data_offset": 2048, 00:25:54.209 "data_size": 63488 00:25:54.209 }, 00:25:54.209 { 00:25:54.209 "name": "BaseBdev4", 00:25:54.209 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:54.209 "is_configured": true, 00:25:54.209 "data_offset": 2048, 00:25:54.209 "data_size": 63488 00:25:54.209 } 00:25:54.209 ] 00:25:54.209 }' 00:25:54.209 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:54.467 10:33:31 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.467 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.725 "name": "raid_bdev1", 00:25:54.725 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:54.725 "strip_size_kb": 0, 00:25:54.725 "state": "online", 00:25:54.725 "raid_level": "raid1", 00:25:54.725 "superblock": true, 00:25:54.725 "num_base_bdevs": 4, 00:25:54.725 "num_base_bdevs_discovered": 3, 00:25:54.725 "num_base_bdevs_operational": 3, 00:25:54.725 "base_bdevs_list": [ 00:25:54.725 { 00:25:54.725 "name": "spare", 00:25:54.725 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:25:54.725 "is_configured": true, 00:25:54.725 "data_offset": 2048, 00:25:54.725 "data_size": 63488 00:25:54.725 }, 00:25:54.725 { 00:25:54.725 "name": null, 00:25:54.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.725 "is_configured": false, 00:25:54.725 "data_offset": 2048, 00:25:54.725 "data_size": 63488 00:25:54.725 }, 00:25:54.725 { 00:25:54.725 "name": "BaseBdev3", 00:25:54.725 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:54.725 "is_configured": true, 00:25:54.725 "data_offset": 2048, 00:25:54.725 "data_size": 63488 00:25:54.725 }, 00:25:54.725 { 00:25:54.725 "name": "BaseBdev4", 00:25:54.725 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:54.725 "is_configured": true, 00:25:54.725 "data_offset": 2048, 00:25:54.725 "data_size": 63488 00:25:54.725 } 00:25:54.725 ] 00:25:54.725 }' 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.725 10:33:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.984 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:54.984 "name": "raid_bdev1", 00:25:54.984 "uuid": 
"ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:54.984 "strip_size_kb": 0, 00:25:54.984 "state": "online", 00:25:54.984 "raid_level": "raid1", 00:25:54.984 "superblock": true, 00:25:54.984 "num_base_bdevs": 4, 00:25:54.984 "num_base_bdevs_discovered": 3, 00:25:54.984 "num_base_bdevs_operational": 3, 00:25:54.984 "base_bdevs_list": [ 00:25:54.984 { 00:25:54.984 "name": "spare", 00:25:54.984 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:25:54.984 "is_configured": true, 00:25:54.984 "data_offset": 2048, 00:25:54.984 "data_size": 63488 00:25:54.984 }, 00:25:54.984 { 00:25:54.984 "name": null, 00:25:54.984 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.984 "is_configured": false, 00:25:54.984 "data_offset": 2048, 00:25:54.984 "data_size": 63488 00:25:54.984 }, 00:25:54.984 { 00:25:54.984 "name": "BaseBdev3", 00:25:54.984 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:54.984 "is_configured": true, 00:25:54.984 "data_offset": 2048, 00:25:54.984 "data_size": 63488 00:25:54.984 }, 00:25:54.984 { 00:25:54.984 "name": "BaseBdev4", 00:25:54.984 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:54.984 "is_configured": true, 00:25:54.984 "data_offset": 2048, 00:25:54.984 "data_size": 63488 00:25:54.984 } 00:25:54.984 ] 00:25:54.984 }' 00:25:54.984 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:54.984 10:33:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:55.551 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:55.811 [2024-07-15 10:33:32.792387] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:55.811 [2024-07-15 10:33:32.792415] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:55.811 [2024-07-15 10:33:32.792474] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:25:55.811 [2024-07-15 10:33:32.792545] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:55.811 [2024-07-15 10:33:32.792557] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bc18a0 name raid_bdev1, state offline 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:55.811 10:33:32 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:56.070 /dev/nbd0 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:56.070 1+0 records in 00:25:56.070 1+0 records out 00:25:56.070 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241771 s, 16.9 MB/s 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:56.070 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:56.327 /dev/nbd1 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:56.584 1+0 records in 00:25:56.584 1+0 records out 00:25:56.584 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028983 s, 
14.1 MB/s 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:56.584 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:56.585 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:56.842 10:33:33 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:56.842 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:56.842 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:56.842 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:56.842 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:56.842 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:56.842 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:56.842 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:25:57.100 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:25:57.100 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:57.100 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:57.100 10:33:34 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:57.100 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:57.100 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:57.100 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:57.357 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:57.357 [2024-07-15 10:33:34.546194] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:57.357 [2024-07-15 10:33:34.546237] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:57.357 [2024-07-15 10:33:34.546258] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc6930 00:25:57.357 [2024-07-15 10:33:34.546270] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:57.357 [2024-07-15 10:33:34.547887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:57.357 [2024-07-15 10:33:34.547915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:57.357 [2024-07-15 10:33:34.547996] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:57.357 [2024-07-15 10:33:34.548024] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:57.357 [2024-07-15 10:33:34.548128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:57.357 [2024-07-15 10:33:34.548201] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:57.357 spare 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.615 [2024-07-15 10:33:34.648519] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bc2970 00:25:57.615 [2024-07-15 10:33:34.648536] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:57.615 [2024-07-15 10:33:34.648734] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bc30b0 00:25:57.615 [2024-07-15 10:33:34.648883] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bc2970 00:25:57.615 [2024-07-15 10:33:34.648893] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1bc2970 00:25:57.615 [2024-07-15 10:33:34.649001] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:57.615 "name": "raid_bdev1", 00:25:57.615 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:57.615 "strip_size_kb": 0, 00:25:57.615 "state": "online", 00:25:57.615 "raid_level": "raid1", 00:25:57.615 "superblock": true, 00:25:57.615 "num_base_bdevs": 4, 00:25:57.615 "num_base_bdevs_discovered": 3, 00:25:57.615 "num_base_bdevs_operational": 3, 00:25:57.615 "base_bdevs_list": [ 00:25:57.615 { 00:25:57.615 "name": "spare", 00:25:57.615 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:25:57.615 "is_configured": true, 00:25:57.615 "data_offset": 2048, 00:25:57.615 "data_size": 63488 00:25:57.615 }, 00:25:57.615 { 00:25:57.615 "name": null, 00:25:57.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.615 "is_configured": false, 00:25:57.615 "data_offset": 2048, 00:25:57.615 "data_size": 63488 00:25:57.615 }, 00:25:57.615 { 00:25:57.615 "name": "BaseBdev3", 00:25:57.615 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:57.615 "is_configured": true, 00:25:57.615 "data_offset": 2048, 00:25:57.615 "data_size": 63488 00:25:57.615 }, 00:25:57.615 { 00:25:57.615 "name": "BaseBdev4", 00:25:57.615 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:57.615 "is_configured": true, 00:25:57.615 "data_offset": 2048, 00:25:57.615 "data_size": 63488 00:25:57.615 } 00:25:57.615 ] 00:25:57.615 }' 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:57.615 10:33:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:58.181 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:58.181 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:25:58.181 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:58.181 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:58.181 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.181 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.181 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.439 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.439 "name": "raid_bdev1", 00:25:58.439 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:58.439 "strip_size_kb": 0, 00:25:58.439 "state": "online", 00:25:58.439 "raid_level": "raid1", 00:25:58.439 "superblock": true, 00:25:58.439 "num_base_bdevs": 4, 00:25:58.439 "num_base_bdevs_discovered": 3, 00:25:58.439 "num_base_bdevs_operational": 3, 00:25:58.439 "base_bdevs_list": [ 00:25:58.439 { 00:25:58.439 "name": "spare", 00:25:58.439 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:25:58.439 "is_configured": true, 00:25:58.439 "data_offset": 2048, 00:25:58.439 "data_size": 63488 00:25:58.439 }, 00:25:58.439 { 00:25:58.439 "name": null, 00:25:58.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.439 "is_configured": false, 00:25:58.439 "data_offset": 2048, 00:25:58.439 "data_size": 63488 00:25:58.439 }, 00:25:58.439 { 00:25:58.439 "name": "BaseBdev3", 00:25:58.439 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:58.439 "is_configured": true, 00:25:58.439 "data_offset": 2048, 00:25:58.439 "data_size": 63488 00:25:58.439 }, 00:25:58.439 { 00:25:58.439 "name": "BaseBdev4", 00:25:58.439 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:58.439 "is_configured": true, 00:25:58.439 "data_offset": 2048, 00:25:58.439 
"data_size": 63488 00:25:58.439 } 00:25:58.440 ] 00:25:58.440 }' 00:25:58.440 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.440 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:58.440 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.440 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:58.440 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.440 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:58.698 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.698 10:33:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:58.956 [2024-07-15 10:33:36.098409] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.956 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.215 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.215 "name": "raid_bdev1", 00:25:59.215 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:25:59.215 "strip_size_kb": 0, 00:25:59.215 "state": "online", 00:25:59.215 "raid_level": "raid1", 00:25:59.215 "superblock": true, 00:25:59.215 "num_base_bdevs": 4, 00:25:59.215 "num_base_bdevs_discovered": 2, 00:25:59.215 "num_base_bdevs_operational": 2, 00:25:59.215 "base_bdevs_list": [ 00:25:59.215 { 00:25:59.215 "name": null, 00:25:59.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.215 "is_configured": false, 00:25:59.215 "data_offset": 2048, 00:25:59.215 "data_size": 63488 00:25:59.215 }, 00:25:59.215 { 00:25:59.215 "name": null, 00:25:59.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.215 "is_configured": false, 00:25:59.215 "data_offset": 2048, 00:25:59.215 "data_size": 63488 00:25:59.215 }, 00:25:59.215 { 00:25:59.215 "name": "BaseBdev3", 00:25:59.215 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:25:59.215 "is_configured": true, 00:25:59.215 "data_offset": 2048, 00:25:59.215 "data_size": 63488 00:25:59.215 }, 00:25:59.215 { 00:25:59.215 "name": "BaseBdev4", 00:25:59.215 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:25:59.215 "is_configured": true, 
00:25:59.215 "data_offset": 2048, 00:25:59.215 "data_size": 63488 00:25:59.215 } 00:25:59.215 ] 00:25:59.215 }' 00:25:59.215 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.215 10:33:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:59.781 10:33:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:00.039 [2024-07-15 10:33:37.177293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:00.039 [2024-07-15 10:33:37.177439] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:00.039 [2024-07-15 10:33:37.177457] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:00.039 [2024-07-15 10:33:37.177486] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:00.039 [2024-07-15 10:33:37.181404] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bc7c70 00:26:00.039 [2024-07-15 10:33:37.183737] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:00.039 10:33:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.412 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:01.412 "name": "raid_bdev1", 00:26:01.412 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:01.412 "strip_size_kb": 0, 00:26:01.412 "state": "online", 00:26:01.412 "raid_level": "raid1", 00:26:01.412 "superblock": true, 00:26:01.412 "num_base_bdevs": 4, 00:26:01.412 "num_base_bdevs_discovered": 3, 00:26:01.412 "num_base_bdevs_operational": 3, 00:26:01.412 "process": { 00:26:01.412 "type": "rebuild", 00:26:01.412 "target": "spare", 00:26:01.412 "progress": { 00:26:01.412 "blocks": 24576, 00:26:01.412 "percent": 38 00:26:01.412 } 00:26:01.412 }, 00:26:01.412 "base_bdevs_list": [ 00:26:01.412 { 00:26:01.412 "name": "spare", 00:26:01.412 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:26:01.412 "is_configured": true, 00:26:01.412 "data_offset": 2048, 00:26:01.412 "data_size": 63488 00:26:01.412 }, 00:26:01.412 { 00:26:01.412 "name": null, 00:26:01.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.412 "is_configured": false, 00:26:01.412 "data_offset": 2048, 00:26:01.412 "data_size": 63488 00:26:01.412 }, 00:26:01.412 { 00:26:01.412 "name": "BaseBdev3", 00:26:01.412 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:01.412 "is_configured": true, 00:26:01.412 "data_offset": 2048, 00:26:01.412 "data_size": 63488 00:26:01.412 }, 00:26:01.412 { 00:26:01.412 "name": "BaseBdev4", 00:26:01.412 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:01.412 "is_configured": true, 00:26:01.413 "data_offset": 2048, 00:26:01.413 "data_size": 63488 00:26:01.413 } 00:26:01.413 ] 00:26:01.413 }' 00:26:01.413 10:33:38 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:01.413 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:01.413 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:01.413 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:01.413 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:01.671 [2024-07-15 10:33:38.758653] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:01.671 [2024-07-15 10:33:38.796040] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:01.671 [2024-07-15 10:33:38.796081] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:01.671 [2024-07-15 10:33:38.796098] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:01.671 [2024-07-15 10:33:38.796106] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:01.671 10:33:38 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.671 10:33:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.929 10:33:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.929 "name": "raid_bdev1", 00:26:01.929 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:01.929 "strip_size_kb": 0, 00:26:01.929 "state": "online", 00:26:01.929 "raid_level": "raid1", 00:26:01.929 "superblock": true, 00:26:01.929 "num_base_bdevs": 4, 00:26:01.929 "num_base_bdevs_discovered": 2, 00:26:01.929 "num_base_bdevs_operational": 2, 00:26:01.929 "base_bdevs_list": [ 00:26:01.929 { 00:26:01.929 "name": null, 00:26:01.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.929 "is_configured": false, 00:26:01.929 "data_offset": 2048, 00:26:01.929 "data_size": 63488 00:26:01.929 }, 00:26:01.929 { 00:26:01.929 "name": null, 00:26:01.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.929 "is_configured": false, 00:26:01.929 "data_offset": 2048, 00:26:01.929 "data_size": 63488 00:26:01.929 }, 00:26:01.929 { 00:26:01.929 "name": "BaseBdev3", 00:26:01.929 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:01.929 "is_configured": true, 00:26:01.929 "data_offset": 2048, 00:26:01.929 "data_size": 63488 00:26:01.929 }, 00:26:01.929 { 00:26:01.929 "name": "BaseBdev4", 00:26:01.929 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 
00:26:01.929 "is_configured": true, 00:26:01.929 "data_offset": 2048, 00:26:01.929 "data_size": 63488 00:26:01.929 } 00:26:01.929 ] 00:26:01.929 }' 00:26:01.929 10:33:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.929 10:33:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:02.496 10:33:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:02.754 [2024-07-15 10:33:39.887250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:02.754 [2024-07-15 10:33:39.887301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:02.754 [2024-07-15 10:33:39.887322] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c3b310 00:26:02.754 [2024-07-15 10:33:39.887335] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:02.754 [2024-07-15 10:33:39.887708] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:02.754 [2024-07-15 10:33:39.887726] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:02.754 [2024-07-15 10:33:39.887801] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:02.754 [2024-07-15 10:33:39.887814] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:02.754 [2024-07-15 10:33:39.887826] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:02.754 [2024-07-15 10:33:39.887845] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:02.754 [2024-07-15 10:33:39.891800] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c41580 00:26:02.754 spare 00:26:02.754 [2024-07-15 10:33:39.893301] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:02.754 10:33:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:04.130 10:33:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:04.130 10:33:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.130 10:33:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:04.130 10:33:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:04.130 10:33:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.130 10:33:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.130 10:33:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.130 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.130 "name": "raid_bdev1", 00:26:04.130 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:04.130 "strip_size_kb": 0, 00:26:04.130 "state": "online", 00:26:04.130 "raid_level": "raid1", 00:26:04.130 "superblock": true, 00:26:04.130 "num_base_bdevs": 4, 00:26:04.130 "num_base_bdevs_discovered": 3, 00:26:04.130 "num_base_bdevs_operational": 3, 00:26:04.130 "process": { 00:26:04.130 "type": "rebuild", 00:26:04.130 "target": "spare", 00:26:04.130 "progress": { 00:26:04.130 "blocks": 24576, 00:26:04.130 
"percent": 38 00:26:04.130 } 00:26:04.130 }, 00:26:04.130 "base_bdevs_list": [ 00:26:04.130 { 00:26:04.130 "name": "spare", 00:26:04.130 "uuid": "2a4e646d-fabd-52ad-8bbb-9de001b4d8b6", 00:26:04.130 "is_configured": true, 00:26:04.130 "data_offset": 2048, 00:26:04.130 "data_size": 63488 00:26:04.130 }, 00:26:04.130 { 00:26:04.130 "name": null, 00:26:04.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.130 "is_configured": false, 00:26:04.130 "data_offset": 2048, 00:26:04.130 "data_size": 63488 00:26:04.130 }, 00:26:04.130 { 00:26:04.130 "name": "BaseBdev3", 00:26:04.130 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:04.130 "is_configured": true, 00:26:04.130 "data_offset": 2048, 00:26:04.130 "data_size": 63488 00:26:04.130 }, 00:26:04.130 { 00:26:04.130 "name": "BaseBdev4", 00:26:04.130 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:04.130 "is_configured": true, 00:26:04.130 "data_offset": 2048, 00:26:04.130 "data_size": 63488 00:26:04.130 } 00:26:04.130 ] 00:26:04.130 }' 00:26:04.130 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.130 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:04.130 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.130 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:04.130 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:04.389 [2024-07-15 10:33:41.477566] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:04.389 [2024-07-15 10:33:41.506085] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:04.389 [2024-07-15 10:33:41.506128] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:04.389 [2024-07-15 10:33:41.506144] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:04.389 [2024-07-15 10:33:41.506152] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:04.389 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:04.390 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:04.390 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.390 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.648 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:04.648 "name": "raid_bdev1", 00:26:04.648 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:04.648 "strip_size_kb": 0, 00:26:04.648 "state": 
"online", 00:26:04.648 "raid_level": "raid1", 00:26:04.648 "superblock": true, 00:26:04.648 "num_base_bdevs": 4, 00:26:04.648 "num_base_bdevs_discovered": 2, 00:26:04.648 "num_base_bdevs_operational": 2, 00:26:04.648 "base_bdevs_list": [ 00:26:04.648 { 00:26:04.648 "name": null, 00:26:04.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.648 "is_configured": false, 00:26:04.648 "data_offset": 2048, 00:26:04.648 "data_size": 63488 00:26:04.648 }, 00:26:04.648 { 00:26:04.648 "name": null, 00:26:04.648 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.648 "is_configured": false, 00:26:04.648 "data_offset": 2048, 00:26:04.648 "data_size": 63488 00:26:04.648 }, 00:26:04.648 { 00:26:04.648 "name": "BaseBdev3", 00:26:04.648 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:04.648 "is_configured": true, 00:26:04.648 "data_offset": 2048, 00:26:04.648 "data_size": 63488 00:26:04.648 }, 00:26:04.648 { 00:26:04.648 "name": "BaseBdev4", 00:26:04.648 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:04.648 "is_configured": true, 00:26:04.648 "data_offset": 2048, 00:26:04.648 "data_size": 63488 00:26:04.648 } 00:26:04.648 ] 00:26:04.648 }' 00:26:04.648 10:33:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:04.648 10:33:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:05.215 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:05.215 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:05.215 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:05.215 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:05.215 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:05.215 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.215 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:05.473 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:05.473 "name": "raid_bdev1", 00:26:05.473 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:05.473 "strip_size_kb": 0, 00:26:05.473 "state": "online", 00:26:05.473 "raid_level": "raid1", 00:26:05.473 "superblock": true, 00:26:05.473 "num_base_bdevs": 4, 00:26:05.473 "num_base_bdevs_discovered": 2, 00:26:05.473 "num_base_bdevs_operational": 2, 00:26:05.473 "base_bdevs_list": [ 00:26:05.473 { 00:26:05.473 "name": null, 00:26:05.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.473 "is_configured": false, 00:26:05.473 "data_offset": 2048, 00:26:05.473 "data_size": 63488 00:26:05.473 }, 00:26:05.473 { 00:26:05.473 "name": null, 00:26:05.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:05.473 "is_configured": false, 00:26:05.473 "data_offset": 2048, 00:26:05.473 "data_size": 63488 00:26:05.473 }, 00:26:05.473 { 00:26:05.473 "name": "BaseBdev3", 00:26:05.473 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:05.473 "is_configured": true, 00:26:05.473 "data_offset": 2048, 00:26:05.473 "data_size": 63488 00:26:05.473 }, 00:26:05.473 { 00:26:05.473 "name": "BaseBdev4", 00:26:05.473 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:05.473 "is_configured": true, 00:26:05.473 "data_offset": 2048, 00:26:05.473 "data_size": 63488 00:26:05.473 } 00:26:05.473 ] 00:26:05.473 }' 00:26:05.473 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:05.473 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:05.473 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:26:05.731 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:05.731 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:05.990 10:33:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:05.990 [2024-07-15 10:33:43.159150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:05.990 [2024-07-15 10:33:43.159198] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:05.990 [2024-07-15 10:33:43.159219] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc7710 00:26:05.990 [2024-07-15 10:33:43.159232] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:05.990 [2024-07-15 10:33:43.159575] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:05.990 [2024-07-15 10:33:43.159594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:05.990 [2024-07-15 10:33:43.159659] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:05.990 [2024-07-15 10:33:43.159672] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:05.990 [2024-07-15 10:33:43.159683] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:05.990 BaseBdev1 00:26:05.990 10:33:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:07.361 
10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:07.361 "name": "raid_bdev1", 00:26:07.361 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:07.361 "strip_size_kb": 0, 00:26:07.361 "state": "online", 00:26:07.361 "raid_level": "raid1", 00:26:07.361 "superblock": true, 00:26:07.361 "num_base_bdevs": 4, 00:26:07.361 "num_base_bdevs_discovered": 2, 00:26:07.361 "num_base_bdevs_operational": 2, 00:26:07.361 "base_bdevs_list": [ 00:26:07.361 { 00:26:07.361 "name": null, 00:26:07.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.361 "is_configured": false, 00:26:07.361 "data_offset": 2048, 00:26:07.361 "data_size": 63488 00:26:07.361 }, 
00:26:07.361 { 00:26:07.361 "name": null, 00:26:07.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.361 "is_configured": false, 00:26:07.361 "data_offset": 2048, 00:26:07.361 "data_size": 63488 00:26:07.361 }, 00:26:07.361 { 00:26:07.361 "name": "BaseBdev3", 00:26:07.361 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:07.361 "is_configured": true, 00:26:07.361 "data_offset": 2048, 00:26:07.361 "data_size": 63488 00:26:07.361 }, 00:26:07.361 { 00:26:07.361 "name": "BaseBdev4", 00:26:07.361 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:07.361 "is_configured": true, 00:26:07.361 "data_offset": 2048, 00:26:07.361 "data_size": 63488 00:26:07.361 } 00:26:07.361 ] 00:26:07.361 }' 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:07.361 10:33:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:07.928 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:07.928 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.928 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:07.928 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:07.928 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.928 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.928 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.235 "name": "raid_bdev1", 00:26:08.235 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:08.235 
"strip_size_kb": 0, 00:26:08.235 "state": "online", 00:26:08.235 "raid_level": "raid1", 00:26:08.235 "superblock": true, 00:26:08.235 "num_base_bdevs": 4, 00:26:08.235 "num_base_bdevs_discovered": 2, 00:26:08.235 "num_base_bdevs_operational": 2, 00:26:08.235 "base_bdevs_list": [ 00:26:08.235 { 00:26:08.235 "name": null, 00:26:08.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.235 "is_configured": false, 00:26:08.235 "data_offset": 2048, 00:26:08.235 "data_size": 63488 00:26:08.235 }, 00:26:08.235 { 00:26:08.235 "name": null, 00:26:08.235 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.235 "is_configured": false, 00:26:08.235 "data_offset": 2048, 00:26:08.235 "data_size": 63488 00:26:08.235 }, 00:26:08.235 { 00:26:08.235 "name": "BaseBdev3", 00:26:08.235 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:08.235 "is_configured": true, 00:26:08.235 "data_offset": 2048, 00:26:08.235 "data_size": 63488 00:26:08.235 }, 00:26:08.235 { 00:26:08.235 "name": "BaseBdev4", 00:26:08.235 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:08.235 "is_configured": true, 00:26:08.235 "data_offset": 2048, 00:26:08.235 "data_size": 63488 00:26:08.235 } 00:26:08.235 ] 00:26:08.235 }' 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:26:08.235 10:33:45 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:08.235 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:08.492 [2024-07-15 10:33:45.573741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:08.492 [2024-07-15 10:33:45.573877] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:08.492 [2024-07-15 10:33:45.573894] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:08.492 request: 00:26:08.492 { 00:26:08.492 "base_bdev": "BaseBdev1", 00:26:08.492 "raid_bdev": "raid_bdev1", 00:26:08.492 "method": "bdev_raid_add_base_bdev", 00:26:08.492 "req_id": 1 00:26:08.492 } 00:26:08.492 Got JSON-RPC error response 00:26:08.492 response: 00:26:08.492 { 00:26:08.492 "code": -22, 00:26:08.492 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:08.492 } 00:26:08.492 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:26:08.492 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:08.492 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:08.492 10:33:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:08.492 10:33:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.426 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:09.684 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:09.684 "name": "raid_bdev1", 00:26:09.684 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:09.684 "strip_size_kb": 0, 00:26:09.684 "state": "online", 00:26:09.684 "raid_level": "raid1", 00:26:09.684 "superblock": true, 00:26:09.684 "num_base_bdevs": 4, 00:26:09.684 "num_base_bdevs_discovered": 2, 00:26:09.684 "num_base_bdevs_operational": 2, 00:26:09.684 "base_bdevs_list": [ 00:26:09.684 { 00:26:09.684 "name": null, 00:26:09.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.684 "is_configured": false, 00:26:09.684 "data_offset": 2048, 00:26:09.684 "data_size": 63488 00:26:09.684 }, 00:26:09.684 { 00:26:09.684 "name": null, 00:26:09.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.684 "is_configured": false, 00:26:09.684 "data_offset": 2048, 00:26:09.684 "data_size": 63488 00:26:09.684 }, 00:26:09.684 { 00:26:09.684 "name": "BaseBdev3", 00:26:09.684 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 00:26:09.684 "is_configured": true, 00:26:09.684 "data_offset": 2048, 00:26:09.684 "data_size": 63488 00:26:09.684 }, 00:26:09.684 { 00:26:09.684 "name": "BaseBdev4", 00:26:09.684 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:09.684 "is_configured": true, 00:26:09.684 "data_offset": 2048, 00:26:09.684 "data_size": 63488 00:26:09.684 } 00:26:09.684 ] 00:26:09.684 }' 00:26:09.684 10:33:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:09.684 10:33:46 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:10.250 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:10.250 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:10.250 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:10.250 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:10.250 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:10.250 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.250 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.508 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:10.508 "name": "raid_bdev1", 00:26:10.508 "uuid": "ffcd5998-829e-4ff4-9b54-a9f1dbdb9b96", 00:26:10.508 "strip_size_kb": 0, 00:26:10.508 "state": "online", 00:26:10.508 "raid_level": "raid1", 00:26:10.508 "superblock": true, 00:26:10.508 "num_base_bdevs": 4, 00:26:10.508 "num_base_bdevs_discovered": 2, 00:26:10.508 "num_base_bdevs_operational": 2, 00:26:10.508 "base_bdevs_list": [ 00:26:10.508 { 00:26:10.508 "name": null, 00:26:10.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.508 "is_configured": false, 00:26:10.508 "data_offset": 2048, 00:26:10.508 "data_size": 63488 00:26:10.508 }, 00:26:10.508 { 00:26:10.508 "name": null, 00:26:10.508 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:10.508 "is_configured": false, 00:26:10.508 "data_offset": 2048, 00:26:10.508 "data_size": 63488 00:26:10.508 }, 00:26:10.508 { 00:26:10.508 "name": "BaseBdev3", 00:26:10.508 "uuid": "4a9c53d9-6098-5e2d-b8f2-1d967452e960", 
00:26:10.508 "is_configured": true, 00:26:10.508 "data_offset": 2048, 00:26:10.508 "data_size": 63488 00:26:10.508 }, 00:26:10.508 { 00:26:10.508 "name": "BaseBdev4", 00:26:10.508 "uuid": "43ea9906-4f7f-53dc-803d-b01dd03374ec", 00:26:10.508 "is_configured": true, 00:26:10.508 "data_offset": 2048, 00:26:10.508 "data_size": 63488 00:26:10.508 } 00:26:10.508 ] 00:26:10.508 }' 00:26:10.508 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:10.508 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 599566 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 599566 ']' 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 599566 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 599566 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 599566' 00:26:10.766 killing process with pid 599566 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 599566 00:26:10.766 Received 
shutdown signal, test time was about 60.000000 seconds 00:26:10.766 00:26:10.766 Latency(us) 00:26:10.766 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:10.766 =================================================================================================================== 00:26:10.766 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:10.766 [2024-07-15 10:33:47.798215] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:10.766 [2024-07-15 10:33:47.798310] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:10.766 10:33:47 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 599566 00:26:10.766 [2024-07-15 10:33:47.798379] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:10.766 [2024-07-15 10:33:47.798394] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bc2970 name raid_bdev1, state offline 00:26:10.766 [2024-07-15 10:33:47.847775] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:26:11.024 00:26:11.024 real 0m37.648s 00:26:11.024 user 0m54.554s 00:26:11.024 sys 0m6.724s 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:11.024 ************************************ 00:26:11.024 END TEST raid_rebuild_test_sb 00:26:11.024 ************************************ 00:26:11.024 10:33:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:11.024 10:33:48 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:26:11.024 10:33:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:11.024 10:33:48 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:26:11.024 10:33:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:11.024 ************************************ 00:26:11.024 START TEST raid_rebuild_test_io 00:26:11.024 ************************************ 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=604819 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 604819 /var/tmp/spdk-raid.sock 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
604819 ']' 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:11.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:11.024 10:33:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:11.281 [2024-07-15 10:33:48.230681] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:26:11.281 [2024-07-15 10:33:48.230752] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid604819 ] 00:26:11.281 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:11.281 Zero copy mechanism will not be used. 
00:26:11.281 [2024-07-15 10:33:48.357128] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:11.281 [2024-07-15 10:33:48.453627] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:11.538 [2024-07-15 10:33:48.513015] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:11.538 [2024-07-15 10:33:48.513051] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:12.101 10:33:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:12.101 10:33:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:26:12.101 10:33:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:12.101 10:33:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:12.358 BaseBdev1_malloc 00:26:12.358 10:33:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:12.615 [2024-07-15 10:33:49.565072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:12.615 [2024-07-15 10:33:49.565119] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:12.615 [2024-07-15 10:33:49.565143] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24afd40 00:26:12.615 [2024-07-15 10:33:49.565156] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:12.615 [2024-07-15 10:33:49.566901] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:12.615 [2024-07-15 10:33:49.566938] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:12.615 BaseBdev1 
00:26:12.615 10:33:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:12.615 10:33:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:12.872 BaseBdev2_malloc 00:26:12.872 10:33:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:12.872 [2024-07-15 10:33:50.068464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:12.872 [2024-07-15 10:33:50.068516] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:12.872 [2024-07-15 10:33:50.068539] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24b0860 00:26:12.872 [2024-07-15 10:33:50.068552] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:12.872 [2024-07-15 10:33:50.070154] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:12.872 [2024-07-15 10:33:50.070183] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:13.129 BaseBdev2 00:26:13.129 10:33:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:13.129 10:33:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:13.129 BaseBdev3_malloc 00:26:13.386 10:33:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:13.386 [2024-07-15 10:33:50.559031] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:13.386 [2024-07-15 10:33:50.559077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:13.386 [2024-07-15 10:33:50.559096] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265d8f0 00:26:13.387 [2024-07-15 10:33:50.559109] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:13.387 [2024-07-15 10:33:50.560621] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:13.387 [2024-07-15 10:33:50.560649] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:13.387 BaseBdev3 00:26:13.387 10:33:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:13.387 10:33:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:13.643 BaseBdev4_malloc 00:26:13.643 10:33:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:13.899 [2024-07-15 10:33:51.044879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:13.899 [2024-07-15 10:33:51.044923] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:13.899 [2024-07-15 10:33:51.044949] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x265cad0 00:26:13.899 [2024-07-15 10:33:51.044961] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:13.899 [2024-07-15 10:33:51.046452] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:13.899 [2024-07-15 10:33:51.046487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:26:13.899 BaseBdev4 00:26:13.899 10:33:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:14.155 spare_malloc 00:26:14.155 10:33:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:14.413 spare_delay 00:26:14.413 10:33:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:14.670 [2024-07-15 10:33:51.787530] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:14.670 [2024-07-15 10:33:51.787580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:14.670 [2024-07-15 10:33:51.787601] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26615b0 00:26:14.670 [2024-07-15 10:33:51.787614] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:14.670 [2024-07-15 10:33:51.789238] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:14.670 [2024-07-15 10:33:51.789265] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:14.670 spare 00:26:14.670 10:33:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:14.928 [2024-07-15 10:33:52.028196] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:14.928 [2024-07-15 10:33:52.029564] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:26:14.928 [2024-07-15 10:33:52.029620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:14.928 [2024-07-15 10:33:52.029666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:14.928 [2024-07-15 10:33:52.029752] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25e08a0 00:26:14.928 [2024-07-15 10:33:52.029762] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:14.928 [2024-07-15 10:33:52.029993] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x265ae10 00:26:14.928 [2024-07-15 10:33:52.030149] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25e08a0 00:26:14.928 [2024-07-15 10:33:52.030160] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25e08a0 00:26:14.928 [2024-07-15 10:33:52.030283] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.928 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.185 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:15.185 "name": "raid_bdev1", 00:26:15.185 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d", 00:26:15.185 "strip_size_kb": 0, 00:26:15.185 "state": "online", 00:26:15.185 "raid_level": "raid1", 00:26:15.185 "superblock": false, 00:26:15.185 "num_base_bdevs": 4, 00:26:15.185 "num_base_bdevs_discovered": 4, 00:26:15.185 "num_base_bdevs_operational": 4, 00:26:15.185 "base_bdevs_list": [ 00:26:15.185 { 00:26:15.185 "name": "BaseBdev1", 00:26:15.185 "uuid": "16f3891b-60c7-55f9-a134-64552ca81e23", 00:26:15.186 "is_configured": true, 00:26:15.186 "data_offset": 0, 00:26:15.186 "data_size": 65536 00:26:15.186 }, 00:26:15.186 { 00:26:15.186 "name": "BaseBdev2", 00:26:15.186 "uuid": "898352a7-8f6e-58dd-bba7-201693b57542", 00:26:15.186 "is_configured": true, 00:26:15.186 "data_offset": 0, 00:26:15.186 "data_size": 65536 00:26:15.186 }, 00:26:15.186 { 00:26:15.186 "name": "BaseBdev3", 00:26:15.186 "uuid": "61777d70-7da0-52af-b286-2c49b39903df", 00:26:15.186 "is_configured": true, 00:26:15.186 "data_offset": 0, 00:26:15.186 "data_size": 65536 00:26:15.186 }, 00:26:15.186 { 00:26:15.186 "name": "BaseBdev4", 00:26:15.186 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9", 00:26:15.186 "is_configured": true, 00:26:15.186 "data_offset": 0, 00:26:15.186 "data_size": 65536 00:26:15.186 } 00:26:15.186 ] 00:26:15.186 }' 00:26:15.186 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:26:15.186 10:33:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:15.751 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:15.751 10:33:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:16.008 [2024-07-15 10:33:53.167494] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:16.008 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:26:16.008 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:16.008 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.266 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:26:16.266 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:16.266 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:16.266 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:16.523 [2024-07-15 10:33:53.538261] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e6970 00:26:16.523 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:16.523 Zero copy mechanism will not be used. 00:26:16.523 Running I/O for 60 seconds... 
00:26:16.523 [2024-07-15 10:33:53.666644] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:16.523 [2024-07-15 10:33:53.682800] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25e6970 00:26:16.523 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:16.523 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:16.523 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.781 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.039 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.039 "name": "raid_bdev1", 00:26:17.039 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d", 00:26:17.039 "strip_size_kb": 0, 00:26:17.039 "state": "online", 00:26:17.039 "raid_level": "raid1", 00:26:17.039 "superblock": false, 
00:26:17.039 "num_base_bdevs": 4,
00:26:17.039 "num_base_bdevs_discovered": 3,
00:26:17.039 "num_base_bdevs_operational": 3,
00:26:17.039 "base_bdevs_list": [
00:26:17.039 {
00:26:17.039 "name": null,
00:26:17.039 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:17.039 "is_configured": false,
00:26:17.039 "data_offset": 0,
00:26:17.039 "data_size": 65536
00:26:17.039 },
00:26:17.039 {
00:26:17.039 "name": "BaseBdev2",
00:26:17.039 "uuid": "898352a7-8f6e-58dd-bba7-201693b57542",
00:26:17.039 "is_configured": true,
00:26:17.039 "data_offset": 0,
00:26:17.039 "data_size": 65536
00:26:17.039 },
00:26:17.039 {
00:26:17.039 "name": "BaseBdev3",
00:26:17.039 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:17.039 "is_configured": true,
00:26:17.039 "data_offset": 0,
00:26:17.039 "data_size": 65536
00:26:17.039 },
00:26:17.039 {
00:26:17.039 "name": "BaseBdev4",
00:26:17.039 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:17.039 "is_configured": true,
00:26:17.039 "data_offset": 0,
00:26:17.039 "data_size": 65536
00:26:17.039 }
00:26:17.039 ]
00:26:17.039 }'
00:26:17.039 10:33:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:17.039 10:33:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x
00:26:17.606 10:33:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:26:17.863 [2024-07-15 10:33:54.832774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:26:17.863 10:33:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1
00:26:17.863 [2024-07-15 10:33:54.889142] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21b6fa0
00:26:17.863 [2024-07-15 10:33:54.891545] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:26:17.864 [2024-07-15 10:33:55.000810] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:26:17.864 [2024-07-15 10:33:55.001910] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:26:18.122 [2024-07-15 10:33:55.214804] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144
00:26:18.122 [2024-07-15 10:33:55.214952] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144
00:26:18.379 [2024-07-15 10:33:55.569788] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288
00:26:18.379 [2024-07-15 10:33:55.570113] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288
00:26:18.637 [2024-07-15 10:33:55.781490] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288
00:26:18.637 [2024-07-15 10:33:55.782163] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288
00:26:18.895 10:33:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:18.895 10:33:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:18.895 10:33:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:18.895 10:33:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:18.895 10:33:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:18.895 10:33:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:18.895 10:33:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:19.153 [2024-07-15 10:33:56.137192] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432
00:26:19.153 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:19.153 "name": "raid_bdev1",
00:26:19.153 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:19.153 "strip_size_kb": 0,
00:26:19.153 "state": "online",
00:26:19.153 "raid_level": "raid1",
00:26:19.153 "superblock": false,
00:26:19.153 "num_base_bdevs": 4,
00:26:19.153 "num_base_bdevs_discovered": 4,
00:26:19.153 "num_base_bdevs_operational": 4,
00:26:19.153 "process": {
00:26:19.153 "type": "rebuild",
00:26:19.153 "target": "spare",
00:26:19.153 "progress": {
00:26:19.153 "blocks": 12288,
00:26:19.153 "percent": 18
00:26:19.153 }
00:26:19.153 },
00:26:19.153 "base_bdevs_list": [
00:26:19.153 {
00:26:19.153 "name": "spare",
00:26:19.153 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6",
00:26:19.153 "is_configured": true,
00:26:19.153 "data_offset": 0,
00:26:19.153 "data_size": 65536
00:26:19.153 },
00:26:19.153 {
00:26:19.153 "name": "BaseBdev2",
00:26:19.153 "uuid": "898352a7-8f6e-58dd-bba7-201693b57542",
00:26:19.153 "is_configured": true,
00:26:19.153 "data_offset": 0,
00:26:19.153 "data_size": 65536
00:26:19.153 },
00:26:19.153 {
00:26:19.153 "name": "BaseBdev3",
00:26:19.153 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:19.153 "is_configured": true,
00:26:19.153 "data_offset": 0,
00:26:19.153 "data_size": 65536
00:26:19.153 },
00:26:19.153 {
00:26:19.153 "name": "BaseBdev4",
00:26:19.153 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:19.153 "is_configured": true,
00:26:19.153 "data_offset": 0,
00:26:19.153 "data_size": 65536
00:26:19.153 }
00:26:19.153 ]
00:26:19.153 }'
00:26:19.153 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:19.154 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:19.154 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:19.154 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:19.154 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:26:19.154 [2024-07-15 10:33:56.348194] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432
00:26:19.412 [2024-07-15 10:33:56.475183] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:26:19.412 [2024-07-15 10:33:56.573600] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:26:19.412 [2024-07-15 10:33:56.575364] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:19.412 [2024-07-15 10:33:56.575393] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:26:19.412 [2024-07-15 10:33:56.575403] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:26:19.412 [2024-07-15 10:33:56.581110] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25e6970
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:19.671 "name": "raid_bdev1",
00:26:19.671 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:19.671 "strip_size_kb": 0,
00:26:19.671 "state": "online",
00:26:19.671 "raid_level": "raid1",
00:26:19.671 "superblock": false,
00:26:19.671 "num_base_bdevs": 4,
00:26:19.671 "num_base_bdevs_discovered": 3,
00:26:19.671 "num_base_bdevs_operational": 3,
00:26:19.671 "base_bdevs_list": [
00:26:19.671 {
00:26:19.671 "name": null,
00:26:19.671 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:19.671 "is_configured": false,
00:26:19.671 "data_offset": 0,
00:26:19.671 "data_size": 65536
00:26:19.671 },
00:26:19.671 {
00:26:19.671 "name": "BaseBdev2",
00:26:19.671 "uuid": "898352a7-8f6e-58dd-bba7-201693b57542",
00:26:19.671 "is_configured": true,
00:26:19.671 "data_offset": 0,
00:26:19.671 "data_size": 65536
00:26:19.671 },
00:26:19.671 {
00:26:19.671 "name": "BaseBdev3",
00:26:19.671 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:19.671 "is_configured": true,
00:26:19.671 "data_offset": 0,
00:26:19.671 "data_size": 65536
00:26:19.671 },
00:26:19.671 {
00:26:19.671 "name": "BaseBdev4",
00:26:19.671 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:19.671 "is_configured": true,
00:26:19.671 "data_offset": 0,
00:26:19.671 "data_size": 65536
00:26:19.671 }
00:26:19.671 ]
00:26:19.671 }'
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:19.671 10:33:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:20.605 "name": "raid_bdev1",
00:26:20.605 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:20.605 "strip_size_kb": 0,
00:26:20.605 "state": "online",
00:26:20.605 "raid_level": "raid1",
00:26:20.605 "superblock": false,
00:26:20.605 "num_base_bdevs": 4,
00:26:20.605 "num_base_bdevs_discovered": 3,
00:26:20.605 "num_base_bdevs_operational": 3,
00:26:20.605 "base_bdevs_list": [
00:26:20.605 {
00:26:20.605 "name": null,
00:26:20.605 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:20.605 "is_configured": false,
00:26:20.605 "data_offset": 0,
00:26:20.605 "data_size": 65536
00:26:20.605 },
00:26:20.605 {
00:26:20.605 "name": "BaseBdev2",
00:26:20.605 "uuid": "898352a7-8f6e-58dd-bba7-201693b57542",
00:26:20.605 "is_configured": true,
00:26:20.605 "data_offset": 0,
00:26:20.605 "data_size": 65536
00:26:20.605 },
00:26:20.605 {
00:26:20.605 "name": "BaseBdev3",
00:26:20.605 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:20.605 "is_configured": true,
00:26:20.605 "data_offset": 0,
00:26:20.605 "data_size": 65536
00:26:20.605 },
00:26:20.605 {
00:26:20.605 "name": "BaseBdev4",
00:26:20.605 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:20.605 "is_configured": true,
00:26:20.605 "data_offset": 0,
00:26:20.605 "data_size": 65536
00:26:20.605 }
00:26:20.605 ]
00:26:20.605 }'
00:26:20.605 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:20.863 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:26:20.864 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:20.864 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:26:20.864 10:33:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:26:21.431 [2024-07-15 10:33:58.353094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:26:21.431 10:33:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1
00:26:21.431 [2024-07-15 10:33:58.430052] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2654f10
00:26:21.431 [2024-07-15 10:33:58.431615] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:26:21.431 [2024-07-15 10:33:58.533015] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:26:21.431 [2024-07-15 10:33:58.533322] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144
00:26:21.690 [2024-07-15 10:33:58.664661] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144
00:26:21.948 [2024-07-15 10:33:59.039726] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288
00:26:22.206 [2024-07-15 10:33:59.251264] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288
00:26:22.206 [2024-07-15 10:33:59.251655] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:22.507 [2024-07-15 10:33:59.603844] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432
00:26:22.507 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:22.507 "name": "raid_bdev1",
00:26:22.507 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:22.507 "strip_size_kb": 0,
00:26:22.507 "state": "online",
00:26:22.507 "raid_level": "raid1",
00:26:22.507 "superblock": false,
00:26:22.507 "num_base_bdevs": 4,
00:26:22.507 "num_base_bdevs_discovered": 4,
00:26:22.507 "num_base_bdevs_operational": 4,
00:26:22.507 "process": {
00:26:22.507 "type": "rebuild",
00:26:22.507 "target": "spare",
00:26:22.507 "progress": {
00:26:22.507 "blocks": 14336,
00:26:22.507 "percent": 21
00:26:22.507 }
00:26:22.507 },
00:26:22.507 "base_bdevs_list": [
00:26:22.507 {
00:26:22.507 "name": "spare",
00:26:22.507 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6",
00:26:22.507 "is_configured": true,
00:26:22.507 "data_offset": 0,
00:26:22.507 "data_size": 65536
00:26:22.507 },
00:26:22.507 {
00:26:22.507 "name": "BaseBdev2",
00:26:22.507 "uuid": "898352a7-8f6e-58dd-bba7-201693b57542",
00:26:22.507 "is_configured": true,
00:26:22.507 "data_offset": 0,
00:26:22.507 "data_size": 65536
00:26:22.507 },
00:26:22.507 {
00:26:22.507 "name": "BaseBdev3",
00:26:22.507 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:22.507 "is_configured": true,
00:26:22.507 "data_offset": 0,
00:26:22.507 "data_size": 65536
00:26:22.507 },
00:26:22.507 {
00:26:22.507 "name": "BaseBdev4",
00:26:22.507 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:22.507 "is_configured": true,
00:26:22.507 "data_offset": 0,
00:26:22.507 "data_size": 65536
00:26:22.507 }
00:26:22.507 ]
00:26:22.507 }'
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']'
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']'
00:26:22.766 10:33:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:26:23.333 [2024-07-15 10:34:00.228953] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:26:23.333 [2024-07-15 10:34:00.309098] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720
00:26:23.333 [2024-07-15 10:34:00.310248] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720
00:26:23.333 [2024-07-15 10:34:00.419733] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x25e6970
00:26:23.333 [2024-07-15 10:34:00.419758] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2654f10
00:26:23.333 [2024-07-15 10:34:00.420299] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]=
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- ))
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:23.333 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:23.591 [2024-07-15 10:34:00.650248] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720
00:26:23.591 [2024-07-15 10:34:00.650762] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720
00:26:23.591 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:23.591 "name": "raid_bdev1",
00:26:23.591 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:23.591 "strip_size_kb": 0,
00:26:23.591 "state": "online",
00:26:23.591 "raid_level": "raid1",
00:26:23.591 "superblock": false,
00:26:23.591 "num_base_bdevs": 4,
00:26:23.591 "num_base_bdevs_discovered": 3,
00:26:23.591 "num_base_bdevs_operational": 3,
00:26:23.591 "process": {
00:26:23.591 "type": "rebuild",
00:26:23.591 "target": "spare",
00:26:23.591 "progress": {
00:26:23.591 "blocks": 28672,
00:26:23.591 "percent": 43
00:26:23.591 }
00:26:23.591 },
00:26:23.591 "base_bdevs_list": [
00:26:23.591 {
00:26:23.591 "name": "spare",
00:26:23.591 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6",
00:26:23.591 "is_configured": true,
00:26:23.591 "data_offset": 0,
00:26:23.591 "data_size": 65536
00:26:23.591 },
00:26:23.591 {
00:26:23.591 "name": null,
00:26:23.591 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:23.591 "is_configured": false,
00:26:23.591 "data_offset": 0,
00:26:23.591 "data_size": 65536
00:26:23.591 },
00:26:23.591 {
00:26:23.591 "name": "BaseBdev3",
00:26:23.591 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:23.591 "is_configured": true,
00:26:23.591 "data_offset": 0,
00:26:23.591 "data_size": 65536
00:26:23.591 },
00:26:23.591 {
00:26:23.591 "name": "BaseBdev4",
00:26:23.591 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:23.591 "is_configured": true,
00:26:23.591 "data_offset": 0,
00:26:23.591 "data_size": 65536
00:26:23.591 }
00:26:23.591 ]
00:26:23.591 }'
00:26:23.591 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:23.591 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:23.591 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=925
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:23.850 10:34:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:24.415 10:34:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:24.415 "name": "raid_bdev1",
00:26:24.415 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:24.415 "strip_size_kb": 0,
00:26:24.415 "state": "online",
00:26:24.415 "raid_level": "raid1",
00:26:24.415 "superblock": false,
00:26:24.415 "num_base_bdevs": 4,
00:26:24.415 "num_base_bdevs_discovered": 3,
00:26:24.415 "num_base_bdevs_operational": 3,
00:26:24.415 "process": {
00:26:24.415 "type": "rebuild",
00:26:24.415 "target": "spare",
00:26:24.415 "progress": {
00:26:24.415 "blocks": 36864,
00:26:24.415 "percent": 56
00:26:24.415 }
00:26:24.415 },
00:26:24.415 "base_bdevs_list": [
00:26:24.415 {
00:26:24.415 "name": "spare",
00:26:24.415 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6",
00:26:24.415 "is_configured": true,
00:26:24.415 "data_offset": 0,
00:26:24.415 "data_size": 65536
00:26:24.415 },
00:26:24.415 {
00:26:24.415 "name": null,
00:26:24.415 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:24.415 "is_configured": false,
00:26:24.415 "data_offset": 0,
00:26:24.415 "data_size": 65536
00:26:24.415 },
00:26:24.415 {
00:26:24.415 "name": "BaseBdev3",
00:26:24.415 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:24.415 "is_configured": true,
00:26:24.415 "data_offset": 0,
00:26:24.415 "data_size": 65536
00:26:24.415 },
00:26:24.415 {
00:26:24.415 "name": "BaseBdev4",
00:26:24.415 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:24.415 "is_configured": true,
00:26:24.415 "data_offset": 0,
00:26:24.415 "data_size": 65536
00:26:24.415 }
00:26:24.415 ]
00:26:24.415 }'
00:26:24.415 10:34:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:24.415 [2024-07-15 10:34:01.364819] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008
00:26:24.415 10:34:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:24.415 10:34:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:24.415 10:34:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:24.415 10:34:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:26:24.673 [2024-07-15 10:34:01.798871] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152
00:26:24.932 [2024-07-15 10:34:01.900746] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152
00:26:24.932 [2024-07-15 10:34:01.900936] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:25.500 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:25.500 "name": "raid_bdev1",
00:26:25.500 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:25.500 "strip_size_kb": 0,
00:26:25.500 "state": "online",
00:26:25.500 "raid_level": "raid1",
00:26:25.500 "superblock": false,
00:26:25.500 "num_base_bdevs": 4,
00:26:25.500 "num_base_bdevs_discovered": 3,
00:26:25.500 "num_base_bdevs_operational": 3,
00:26:25.500 "process": {
00:26:25.500 "type": "rebuild",
00:26:25.500 "target": "spare",
00:26:25.500 "progress": {
00:26:25.500 "blocks": 57344,
00:26:25.500 "percent": 87
00:26:25.500 }
00:26:25.500 },
00:26:25.500 "base_bdevs_list": [
00:26:25.500 {
00:26:25.500 "name": "spare",
00:26:25.500 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6",
00:26:25.500 "is_configured": true,
00:26:25.500 "data_offset": 0,
00:26:25.500 "data_size": 65536
00:26:25.500 },
00:26:25.500 {
00:26:25.500 "name": null,
00:26:25.500 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:25.500 "is_configured": false,
00:26:25.500 "data_offset": 0,
00:26:25.500 "data_size": 65536
00:26:25.500 },
00:26:25.500 {
00:26:25.500 "name": "BaseBdev3",
00:26:25.500 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:25.500 "is_configured": true,
00:26:25.500 "data_offset": 0,
00:26:25.500 "data_size": 65536
00:26:25.500 },
00:26:25.500 {
00:26:25.500 "name": "BaseBdev4",
00:26:25.500 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:25.500 "is_configured": true,
00:26:25.500 "data_offset": 0,
00:26:25.500 "data_size": 65536
00:26:25.500 }
00:26:25.500 ]
00:26:25.500 }'
00:26:25.758 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:25.758 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:25.758 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:25.758 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:25.758 10:34:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:26:26.017 [2024-07-15 10:34:03.047397] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:26:26.017 [2024-07-15 10:34:03.155630] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:26:26.017 [2024-07-15 10:34:03.158222] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:26.953 10:34:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:26.953 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:26.953 "name": "raid_bdev1",
00:26:26.953 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:26.953 "strip_size_kb": 0,
00:26:26.953 "state": "online",
00:26:26.953 "raid_level": "raid1",
00:26:26.953 "superblock": false,
00:26:26.953 "num_base_bdevs": 4,
00:26:26.953 "num_base_bdevs_discovered": 3,
00:26:26.953 "num_base_bdevs_operational": 3,
00:26:26.953 "base_bdevs_list": [
00:26:26.953 {
00:26:26.953 "name": "spare",
00:26:26.953 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6",
00:26:26.953 "is_configured": true,
00:26:26.953 "data_offset": 0,
00:26:26.953 "data_size": 65536
00:26:26.953 },
00:26:26.953 {
00:26:26.953 "name": null,
00:26:26.953 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:26.953 "is_configured": false,
00:26:26.953 "data_offset": 0,
00:26:26.953 "data_size": 65536
00:26:26.953 },
00:26:26.953 {
00:26:26.953 "name": "BaseBdev3",
00:26:26.953 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:26.953 "is_configured": true,
00:26:26.953 "data_offset": 0,
00:26:26.953 "data_size": 65536
00:26:26.953 },
00:26:26.953 {
00:26:26.953 "name": "BaseBdev4",
00:26:26.953 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:26.953 "is_configured": true,
00:26:26.953 "data_offset": 0,
00:26:26.953 "data_size": 65536
00:26:26.953 }
00:26:26.953 ]
00:26:26.953 }'
00:26:26.953 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:26.953 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:26:26.953 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:27.211 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:27.469 "name": "raid_bdev1",
00:26:27.469 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d",
00:26:27.469 "strip_size_kb": 0,
00:26:27.469 "state": "online",
00:26:27.469 "raid_level": "raid1",
00:26:27.469 "superblock": false,
00:26:27.469 "num_base_bdevs": 4,
00:26:27.469 "num_base_bdevs_discovered": 3,
00:26:27.469 "num_base_bdevs_operational": 3,
00:26:27.469 "base_bdevs_list": [
00:26:27.469 {
00:26:27.469 "name": "spare",
00:26:27.469 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6",
00:26:27.469 "is_configured": true,
00:26:27.469 "data_offset": 0,
00:26:27.469 "data_size": 65536
00:26:27.469 },
00:26:27.469 {
00:26:27.469 "name": null,
00:26:27.469 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:27.469 "is_configured": false,
00:26:27.469 "data_offset": 0,
00:26:27.469 "data_size": 65536
00:26:27.469 },
00:26:27.469 {
00:26:27.469 "name": "BaseBdev3",
00:26:27.469 "uuid": "61777d70-7da0-52af-b286-2c49b39903df",
00:26:27.469 "is_configured": true,
00:26:27.469 "data_offset": 0,
00:26:27.469 "data_size": 65536
00:26:27.469 },
00:26:27.469 {
00:26:27.469 "name": "BaseBdev4",
00:26:27.469 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9",
00:26:27.469 "is_configured": true,
00:26:27.469 "data_offset": 0,
00:26:27.469 "data_size": 65536
00:26:27.469 }
00:26:27.469 ]
00:26:27.469 }'
00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- 
# jq -r '.process.type // "none"' 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.469 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.727 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:27.727 "name": "raid_bdev1", 00:26:27.727 "uuid": "e3e2377e-feaf-4e77-87de-c7e4b4d3940d", 00:26:27.727 
"strip_size_kb": 0, 00:26:27.727 "state": "online", 00:26:27.727 "raid_level": "raid1", 00:26:27.727 "superblock": false, 00:26:27.727 "num_base_bdevs": 4, 00:26:27.727 "num_base_bdevs_discovered": 3, 00:26:27.727 "num_base_bdevs_operational": 3, 00:26:27.727 "base_bdevs_list": [ 00:26:27.728 { 00:26:27.728 "name": "spare", 00:26:27.728 "uuid": "5ae17867-b61c-5785-88cf-692ccbf9eaa6", 00:26:27.728 "is_configured": true, 00:26:27.728 "data_offset": 0, 00:26:27.728 "data_size": 65536 00:26:27.728 }, 00:26:27.728 { 00:26:27.728 "name": null, 00:26:27.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.728 "is_configured": false, 00:26:27.728 "data_offset": 0, 00:26:27.728 "data_size": 65536 00:26:27.728 }, 00:26:27.728 { 00:26:27.728 "name": "BaseBdev3", 00:26:27.728 "uuid": "61777d70-7da0-52af-b286-2c49b39903df", 00:26:27.728 "is_configured": true, 00:26:27.728 "data_offset": 0, 00:26:27.728 "data_size": 65536 00:26:27.728 }, 00:26:27.728 { 00:26:27.728 "name": "BaseBdev4", 00:26:27.728 "uuid": "97907f11-f90a-5e90-b033-448a8516dbc9", 00:26:27.728 "is_configured": true, 00:26:27.728 "data_offset": 0, 00:26:27.728 "data_size": 65536 00:26:27.728 } 00:26:27.728 ] 00:26:27.728 }' 00:26:27.728 10:34:04 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:27.728 10:34:04 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:28.295 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:28.554 [2024-07-15 10:34:05.621849] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:28.554 [2024-07-15 10:34:05.621883] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:28.554 00:26:28.554 Latency(us) 00:26:28.554 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:28.554 Job: 
raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:28.554 raid_bdev1 : 12.14 88.50 265.50 0.00 0.00 15160.48 293.84 122181.90 00:26:28.554 =================================================================================================================== 00:26:28.554 Total : 88.50 265.50 0.00 0.00 15160.48 293.84 122181.90 00:26:28.554 [2024-07-15 10:34:05.709997] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:28.554 [2024-07-15 10:34:05.710025] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:28.554 [2024-07-15 10:34:05.710119] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:28.554 [2024-07-15 10:34:05.710132] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e08a0 name raid_bdev1, state offline 00:26:28.554 0 00:26:28.554 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:28.554 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:28.812 10:34:05 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:28.812 10:34:05 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:29.379 /dev/nbd0 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:29.379 1+0 records in 00:26:29.379 1+0 records out 
00:26:29.379 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026225 s, 15.6 MB/s 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd1') 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:29.379 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:26:29.638 /dev/nbd1 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:29.638 1+0 records in 00:26:29.638 1+0 records out 00:26:29.638 4096 bytes (4.1 kB, 4.0 KiB) copied, 
0.000387234 s, 10.6 MB/s 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:29.638 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:29.897 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:29.897 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:29.897 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:29.897 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:29.897 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:29.897 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:29.897 10:34:06 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:29.897 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:29.897 10:34:07 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:29.897 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:29.897 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:29.897 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:29.897 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:29.898 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
nbd_start_disk BaseBdev4 /dev/nbd1 00:26:30.157 /dev/nbd1 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:30.157 1+0 records in 00:26:30.157 1+0 records out 00:26:30.157 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233991 s, 17.5 MB/s 00:26:30.157 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:30.416 
10:34:07 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:30.416 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:30.675 10:34:07 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:30.675 10:34:07 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 604819 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@948 -- # '[' -z 604819 ']' 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 604819 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 604819 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 604819' 00:26:31.241 killing process with pid 604819 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 604819 00:26:31.241 Received shutdown signal, test time was about 14.705496 seconds 00:26:31.241 00:26:31.241 Latency(us) 00:26:31.241 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:31.241 =================================================================================================================== 00:26:31.241 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:31.241 [2024-07-15 10:34:08.282068] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:31.241 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 604819 00:26:31.241 [2024-07-15 10:34:08.328759] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:31.501 00:26:31.501 real 0m20.389s 00:26:31.501 user 0m32.122s 00:26:31.501 sys 0m3.530s 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:26:31.501 ************************************ 00:26:31.501 END TEST raid_rebuild_test_io 00:26:31.501 ************************************ 00:26:31.501 10:34:08 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:31.501 10:34:08 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:26:31.501 10:34:08 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:31.501 10:34:08 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:31.501 10:34:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:31.501 ************************************ 00:26:31.501 START TEST raid_rebuild_test_sb_io 00:26:31.501 ************************************ 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:31.501 10:34:08 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=607697 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 607697 /var/tmp/spdk-raid.sock 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 607697 ']' 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:31.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:31.501 10:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:31.759 [2024-07-15 10:34:08.724476] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:26:31.759 [2024-07-15 10:34:08.724552] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid607697 ] 00:26:31.759 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:31.759 Zero copy mechanism will not be used. 00:26:31.759 [2024-07-15 10:34:08.855579] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:31.759 [2024-07-15 10:34:08.957127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:32.018 [2024-07-15 10:34:09.018520] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:32.018 [2024-07-15 10:34:09.018559] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:32.583 10:34:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:32.583 10:34:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:26:32.583 10:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:32.583 10:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:32.840 BaseBdev1_malloc 00:26:32.840 10:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:32.840 [2024-07-15 10:34:09.974047] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:32.840 [2024-07-15 10:34:09.974101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:32.840 [2024-07-15 10:34:09.974123] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bf3d40 00:26:32.840 [2024-07-15 10:34:09.974135] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:32.840 [2024-07-15 10:34:09.975696] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:32.840 [2024-07-15 10:34:09.975724] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:32.840 BaseBdev1 00:26:32.840 10:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:32.840 10:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:33.097 BaseBdev2_malloc 00:26:33.097 10:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:33.354 [2024-07-15 10:34:10.396172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:33.354 [2024-07-15 10:34:10.396217] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.354 [2024-07-15 10:34:10.396240] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bf4860 00:26:33.354 [2024-07-15 10:34:10.396253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.354 [2024-07-15 10:34:10.397619] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.354 [2024-07-15 10:34:10.397645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:33.354 BaseBdev2 00:26:33.354 10:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:33.354 10:34:10 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:33.611 BaseBdev3_malloc 00:26:33.611 10:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:33.867 [2024-07-15 10:34:10.886073] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:33.867 [2024-07-15 10:34:10.886118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.867 [2024-07-15 10:34:10.886137] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1da18f0 00:26:33.867 [2024-07-15 10:34:10.886150] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.867 [2024-07-15 10:34:10.887529] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.867 [2024-07-15 10:34:10.887561] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:33.867 BaseBdev3 00:26:33.867 10:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:33.867 10:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:34.123 BaseBdev4_malloc 00:26:34.123 10:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:26:34.123 [2024-07-15 10:34:11.311886] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:34.123 [2024-07-15 10:34:11.311941] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:26:34.123 [2024-07-15 10:34:11.311962] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1da0ad0 00:26:34.123 [2024-07-15 10:34:11.311975] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.123 [2024-07-15 10:34:11.313410] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.123 [2024-07-15 10:34:11.313438] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:34.123 BaseBdev4 00:26:34.380 10:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:34.380 spare_malloc 00:26:34.636 10:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:34.636 spare_delay 00:26:34.636 10:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:34.894 [2024-07-15 10:34:11.918195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:34.894 [2024-07-15 10:34:11.918242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:34.894 [2024-07-15 10:34:11.918263] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1da55b0 00:26:34.894 [2024-07-15 10:34:11.918276] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:34.894 [2024-07-15 10:34:11.919773] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:34.894 [2024-07-15 10:34:11.919800] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: spare 00:26:34.894 spare 00:26:34.894 10:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:34.894 [2024-07-15 10:34:12.090683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:34.894 [2024-07-15 10:34:12.091851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:34.894 [2024-07-15 10:34:12.091904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:34.894 [2024-07-15 10:34:12.091959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:35.151 [2024-07-15 10:34:12.092149] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d248a0 00:26:35.151 [2024-07-15 10:34:12.092161] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:35.151 [2024-07-15 10:34:12.092356] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d9ee10 00:26:35.151 [2024-07-15 10:34:12.092502] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d248a0 00:26:35.151 [2024-07-15 10:34:12.092512] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d248a0 00:26:35.151 [2024-07-15 10:34:12.092602] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.151 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:35.407 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:35.407 "name": "raid_bdev1", 00:26:35.407 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:35.407 "strip_size_kb": 0, 00:26:35.407 "state": "online", 00:26:35.407 "raid_level": "raid1", 00:26:35.407 "superblock": true, 00:26:35.407 "num_base_bdevs": 4, 00:26:35.407 "num_base_bdevs_discovered": 4, 00:26:35.407 "num_base_bdevs_operational": 4, 00:26:35.407 "base_bdevs_list": [ 00:26:35.407 { 00:26:35.407 "name": "BaseBdev1", 00:26:35.407 "uuid": "294d4e72-5831-5335-84db-ba915496f288", 00:26:35.407 "is_configured": true, 00:26:35.407 "data_offset": 2048, 00:26:35.407 "data_size": 63488 00:26:35.407 }, 00:26:35.407 { 00:26:35.407 "name": "BaseBdev2", 00:26:35.407 "uuid": "87de7b9d-cb00-56a0-8e7f-d8b4b4f33167", 00:26:35.407 "is_configured": true, 00:26:35.407 "data_offset": 2048, 00:26:35.407 "data_size": 63488 00:26:35.407 }, 00:26:35.407 { 
00:26:35.407 "name": "BaseBdev3", 00:26:35.407 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:35.407 "is_configured": true, 00:26:35.407 "data_offset": 2048, 00:26:35.407 "data_size": 63488 00:26:35.407 }, 00:26:35.407 { 00:26:35.407 "name": "BaseBdev4", 00:26:35.407 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:35.407 "is_configured": true, 00:26:35.407 "data_offset": 2048, 00:26:35.407 "data_size": 63488 00:26:35.407 } 00:26:35.407 ] 00:26:35.407 }' 00:26:35.407 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:35.407 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:35.971 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:35.971 10:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:36.255 [2024-07-15 10:34:13.185890] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:36.255 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:26:36.255 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.255 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:36.511 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:26:36.512 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:26:36.512 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:36.512 10:34:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:26:36.512 [2024-07-15 10:34:13.568732] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf3670 00:26:36.512 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:36.512 Zero copy mechanism will not be used. 00:26:36.512 Running I/O for 60 seconds... 00:26:36.769 [2024-07-15 10:34:13.737321] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:36.769 [2024-07-15 10:34:13.737560] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1bf3670 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.769 10:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.026 10:34:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.026 "name": "raid_bdev1", 00:26:37.026 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:37.026 "strip_size_kb": 0, 00:26:37.026 "state": "online", 00:26:37.026 "raid_level": "raid1", 00:26:37.026 "superblock": true, 00:26:37.026 "num_base_bdevs": 4, 00:26:37.026 "num_base_bdevs_discovered": 3, 00:26:37.026 "num_base_bdevs_operational": 3, 00:26:37.026 "base_bdevs_list": [ 00:26:37.026 { 00:26:37.026 "name": null, 00:26:37.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.026 "is_configured": false, 00:26:37.026 "data_offset": 2048, 00:26:37.026 "data_size": 63488 00:26:37.026 }, 00:26:37.026 { 00:26:37.026 "name": "BaseBdev2", 00:26:37.026 "uuid": "87de7b9d-cb00-56a0-8e7f-d8b4b4f33167", 00:26:37.026 "is_configured": true, 00:26:37.026 "data_offset": 2048, 00:26:37.026 "data_size": 63488 00:26:37.026 }, 00:26:37.026 { 00:26:37.026 "name": "BaseBdev3", 00:26:37.026 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:37.026 "is_configured": true, 00:26:37.026 "data_offset": 2048, 00:26:37.026 "data_size": 63488 00:26:37.026 }, 00:26:37.026 { 00:26:37.026 "name": "BaseBdev4", 00:26:37.026 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:37.026 "is_configured": true, 00:26:37.026 "data_offset": 2048, 00:26:37.026 "data_size": 63488 00:26:37.026 } 00:26:37.026 ] 00:26:37.026 }' 00:26:37.026 10:34:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.026 10:34:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:37.591 10:34:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 spare 00:26:37.849 [2024-07-15 10:34:14.903734] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:37.849 10:34:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:37.849 [2024-07-15 10:34:14.951287] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d26ba0 00:26:37.849 [2024-07-15 10:34:14.953897] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:38.107 [2024-07-15 10:34:15.082333] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:38.107 [2024-07-15 10:34:15.083760] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:38.364 [2024-07-15 10:34:15.310678] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:38.622 [2024-07-15 10:34:15.657888] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:38.622 [2024-07-15 10:34:15.659183] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:26:38.879 [2024-07-15 10:34:15.861668] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:38.879 [2024-07-15 10:34:15.862317] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:38.879 10:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:38.879 10:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:38.879 10:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:26:38.879 10:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:38.879 10:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:38.879 10:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.879 10:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:39.136 [2024-07-15 10:34:16.218327] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:26:39.136 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:39.136 "name": "raid_bdev1", 00:26:39.136 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:39.136 "strip_size_kb": 0, 00:26:39.136 "state": "online", 00:26:39.136 "raid_level": "raid1", 00:26:39.136 "superblock": true, 00:26:39.136 "num_base_bdevs": 4, 00:26:39.136 "num_base_bdevs_discovered": 4, 00:26:39.136 "num_base_bdevs_operational": 4, 00:26:39.136 "process": { 00:26:39.136 "type": "rebuild", 00:26:39.136 "target": "spare", 00:26:39.136 "progress": { 00:26:39.136 "blocks": 12288, 00:26:39.136 "percent": 19 00:26:39.136 } 00:26:39.136 }, 00:26:39.136 "base_bdevs_list": [ 00:26:39.136 { 00:26:39.136 "name": "spare", 00:26:39.136 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5", 00:26:39.136 "is_configured": true, 00:26:39.136 "data_offset": 2048, 00:26:39.136 "data_size": 63488 00:26:39.136 }, 00:26:39.136 { 00:26:39.136 "name": "BaseBdev2", 00:26:39.136 "uuid": "87de7b9d-cb00-56a0-8e7f-d8b4b4f33167", 00:26:39.136 "is_configured": true, 00:26:39.136 "data_offset": 2048, 00:26:39.136 "data_size": 63488 00:26:39.136 }, 00:26:39.136 { 00:26:39.136 "name": "BaseBdev3", 00:26:39.136 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:39.136 
"is_configured": true, 00:26:39.136 "data_offset": 2048, 00:26:39.136 "data_size": 63488 00:26:39.136 }, 00:26:39.136 { 00:26:39.136 "name": "BaseBdev4", 00:26:39.136 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:39.136 "is_configured": true, 00:26:39.136 "data_offset": 2048, 00:26:39.136 "data_size": 63488 00:26:39.136 } 00:26:39.136 ] 00:26:39.136 }' 00:26:39.136 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:39.136 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:39.136 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:39.136 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:39.136 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:39.393 [2024-07-15 10:34:16.453016] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:39.393 [2024-07-15 10:34:16.453328] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:39.393 [2024-07-15 10:34:16.548816] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:39.650 [2024-07-15 10:34:16.698763] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:39.650 [2024-07-15 10:34:16.721297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:39.650 [2024-07-15 10:34:16.721338] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:39.650 [2024-07-15 10:34:16.721350] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:26:39.650 [2024-07-15 10:34:16.753324] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1bf3670 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.650 10:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:39.907 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:39.907 "name": "raid_bdev1", 00:26:39.907 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:39.907 "strip_size_kb": 0, 00:26:39.907 "state": "online", 00:26:39.907 "raid_level": "raid1", 00:26:39.907 "superblock": true, 00:26:39.907 "num_base_bdevs": 4, 00:26:39.907 
"num_base_bdevs_discovered": 3, 00:26:39.907 "num_base_bdevs_operational": 3, 00:26:39.907 "base_bdevs_list": [ 00:26:39.907 { 00:26:39.907 "name": null, 00:26:39.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:39.907 "is_configured": false, 00:26:39.907 "data_offset": 2048, 00:26:39.907 "data_size": 63488 00:26:39.907 }, 00:26:39.907 { 00:26:39.907 "name": "BaseBdev2", 00:26:39.907 "uuid": "87de7b9d-cb00-56a0-8e7f-d8b4b4f33167", 00:26:39.907 "is_configured": true, 00:26:39.907 "data_offset": 2048, 00:26:39.907 "data_size": 63488 00:26:39.907 }, 00:26:39.907 { 00:26:39.907 "name": "BaseBdev3", 00:26:39.907 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:39.907 "is_configured": true, 00:26:39.907 "data_offset": 2048, 00:26:39.907 "data_size": 63488 00:26:39.907 }, 00:26:39.907 { 00:26:39.907 "name": "BaseBdev4", 00:26:39.907 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:39.907 "is_configured": true, 00:26:39.907 "data_offset": 2048, 00:26:39.907 "data_size": 63488 00:26:39.907 } 00:26:39.907 ] 00:26:39.907 }' 00:26:39.907 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:39.907 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.838 10:34:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.838 "name": "raid_bdev1", 00:26:40.838 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:40.838 "strip_size_kb": 0, 00:26:40.838 "state": "online", 00:26:40.838 "raid_level": "raid1", 00:26:40.838 "superblock": true, 00:26:40.838 "num_base_bdevs": 4, 00:26:40.838 "num_base_bdevs_discovered": 3, 00:26:40.838 "num_base_bdevs_operational": 3, 00:26:40.838 "base_bdevs_list": [ 00:26:40.838 { 00:26:40.838 "name": null, 00:26:40.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.838 "is_configured": false, 00:26:40.838 "data_offset": 2048, 00:26:40.838 "data_size": 63488 00:26:40.838 }, 00:26:40.838 { 00:26:40.838 "name": "BaseBdev2", 00:26:40.838 "uuid": "87de7b9d-cb00-56a0-8e7f-d8b4b4f33167", 00:26:40.838 "is_configured": true, 00:26:40.838 "data_offset": 2048, 00:26:40.838 "data_size": 63488 00:26:40.838 }, 00:26:40.838 { 00:26:40.838 "name": "BaseBdev3", 00:26:40.838 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:40.838 "is_configured": true, 00:26:40.838 "data_offset": 2048, 00:26:40.838 "data_size": 63488 00:26:40.838 }, 00:26:40.838 { 00:26:40.838 "name": "BaseBdev4", 00:26:40.838 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:40.838 "is_configured": true, 00:26:40.838 "data_offset": 2048, 00:26:40.838 "data_size": 63488 00:26:40.838 } 00:26:40.838 ] 00:26:40.838 }' 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:40.838 10:34:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.096 10:34:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:41.096 10:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:41.096 [2024-07-15 10:34:18.278662] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:41.354 10:34:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:41.354 [2024-07-15 10:34:18.345469] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf2d60 00:26:41.354 [2024-07-15 10:34:18.347035] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:41.354 [2024-07-15 10:34:18.468622] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:41.354 [2024-07-15 10:34:18.468933] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:26:41.611 [2024-07-15 10:34:18.582122] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:41.611 [2024-07-15 10:34:18.582398] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:26:41.868 [2024-07-15 10:34:18.961335] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:26:42.125 [2024-07-15 10:34:19.298707] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:26:42.382 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:42.382 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:26:42.382 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:42.382 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:42.382 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:42.382 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:42.382 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:42.639 "name": "raid_bdev1",
00:26:42.639 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592",
00:26:42.639 "strip_size_kb": 0,
00:26:42.639 "state": "online",
00:26:42.639 "raid_level": "raid1",
00:26:42.639 "superblock": true,
00:26:42.639 "num_base_bdevs": 4,
00:26:42.639 "num_base_bdevs_discovered": 4,
00:26:42.639 "num_base_bdevs_operational": 4,
00:26:42.639 "process": {
00:26:42.639 "type": "rebuild",
00:26:42.639 "target": "spare",
00:26:42.639 "progress": {
00:26:42.639 "blocks": 18432,
00:26:42.639 "percent": 29
00:26:42.639 }
00:26:42.639 },
00:26:42.639 "base_bdevs_list": [
00:26:42.639 {
00:26:42.639 "name": "spare",
00:26:42.639 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5",
00:26:42.639 "is_configured": true,
00:26:42.639 "data_offset": 2048,
00:26:42.639 "data_size": 63488
00:26:42.639 },
00:26:42.639 {
00:26:42.639 "name": "BaseBdev2",
00:26:42.639 "uuid": "87de7b9d-cb00-56a0-8e7f-d8b4b4f33167",
00:26:42.639 "is_configured": true,
00:26:42.639 "data_offset": 2048,
00:26:42.639 "data_size": 63488
00:26:42.639 },
00:26:42.639 {
00:26:42.639 "name": "BaseBdev3",
00:26:42.639 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef",
00:26:42.639 "is_configured": true,
00:26:42.639 "data_offset": 2048,
00:26:42.639 "data_size": 63488
00:26:42.639 },
00:26:42.639 {
00:26:42.639 "name": "BaseBdev4",
00:26:42.639 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb",
00:26:42.639 "is_configured": true,
00:26:42.639 "data_offset": 2048,
00:26:42.639 "data_size": 63488
00:26:42.639 }
00:26:42.639 ]
00:26:42.639 }'
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']'
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']'
00:26:42.639 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']'
00:26:42.639 10:34:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:26:42.639 [2024-07-15 10:34:19.776972] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576
00:26:42.897 [2024-07-15 10:34:19.921251] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:26:43.154 [2024-07-15 10:34:20.284858] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1bf3670
00:26:43.154 [2024-07-15 10:34:20.284897] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1bf2d60
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]=
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- ))
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:43.154 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:43.410 [2024-07-15 10:34:20.419862] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720
00:26:43.411 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:43.411 "name": "raid_bdev1",
00:26:43.411 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592",
00:26:43.411 "strip_size_kb": 0,
00:26:43.411 "state": "online",
00:26:43.411 "raid_level": "raid1",
00:26:43.411 "superblock": true,
00:26:43.411 "num_base_bdevs": 4,
00:26:43.411 "num_base_bdevs_discovered": 3,
00:26:43.411 "num_base_bdevs_operational": 3,
00:26:43.411 "process": {
00:26:43.411 "type": "rebuild",
00:26:43.411 "target": "spare",
00:26:43.411 "progress": {
00:26:43.411 "blocks": 28672,
00:26:43.411 "percent": 45
00:26:43.411 }
00:26:43.411 },
00:26:43.411 "base_bdevs_list": [
00:26:43.411 {
00:26:43.411 "name": "spare",
00:26:43.411 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5",
00:26:43.411 "is_configured": true,
00:26:43.411 "data_offset": 2048,
00:26:43.411 "data_size": 63488
00:26:43.411 },
00:26:43.411 {
00:26:43.411 "name": null,
00:26:43.411 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:43.411 "is_configured": false,
00:26:43.411 "data_offset": 2048,
00:26:43.411 "data_size": 63488
00:26:43.411 },
00:26:43.411 {
00:26:43.411 "name": "BaseBdev3",
00:26:43.411 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef",
00:26:43.411 "is_configured": true,
00:26:43.411 "data_offset": 2048,
00:26:43.411 "data_size": 63488
00:26:43.411 },
00:26:43.411 {
00:26:43.411 "name": "BaseBdev4",
00:26:43.411 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb",
00:26:43.411 "is_configured": true,
00:26:43.411 "data_offset": 2048,
00:26:43.411 "data_size": 63488
00:26:43.411 }
00:26:43.411 ]
00:26:43.411 }'
00:26:43.411 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:43.411 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=945
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:43.667 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:43.667 [2024-07-15 10:34:20.789478] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864
00:26:43.667 [2024-07-15 10:34:20.790482] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864
00:26:43.924 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:43.924 "name": "raid_bdev1",
00:26:43.924 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592",
00:26:43.924 "strip_size_kb": 0,
00:26:43.924 "state": "online",
00:26:43.924 "raid_level": "raid1",
00:26:43.924 "superblock": true,
00:26:43.924 "num_base_bdevs": 4,
00:26:43.924 "num_base_bdevs_discovered": 3,
00:26:43.924 "num_base_bdevs_operational": 3,
00:26:43.924 "process": {
00:26:43.924 "type": "rebuild",
00:26:43.924 "target": "spare",
00:26:43.924 "progress": {
00:26:43.924 "blocks": 32768,
00:26:43.924 "percent": 51
00:26:43.924 }
00:26:43.924 },
00:26:43.924 "base_bdevs_list": [
00:26:43.924 {
00:26:43.924 "name": "spare",
00:26:43.924 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5",
00:26:43.924 "is_configured": true,
00:26:43.924 "data_offset": 2048,
00:26:43.924 "data_size": 63488
00:26:43.924 },
00:26:43.924 {
00:26:43.924 "name": null,
00:26:43.924 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:43.924 "is_configured": false,
00:26:43.924 "data_offset": 2048,
00:26:43.924 "data_size": 63488
00:26:43.924 },
00:26:43.924 {
00:26:43.924 "name": "BaseBdev3",
00:26:43.924 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef",
00:26:43.924 "is_configured": true,
00:26:43.924 "data_offset": 2048,
00:26:43.924 "data_size": 63488
00:26:43.924 },
00:26:43.924 {
00:26:43.924 "name": "BaseBdev4",
00:26:43.924 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb",
00:26:43.924 "is_configured": true,
00:26:43.924 "data_offset": 2048,
00:26:43.924 "data_size": 63488
00:26:43.924 }
00:26:43.924 ]
00:26:43.924 }'
00:26:43.924 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:43.924 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:43.924 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:43.924 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:43.924 10:34:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:26:43.924 [2024-07-15 10:34:21.004936] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864
00:26:44.180 [2024-07-15 10:34:21.348806] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008
00:26:44.180 [2024-07-15 10:34:21.349116] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008
00:26:44.438 [2024-07-15 10:34:21.552872] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:45.002 10:34:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:45.259 10:34:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:45.259 "name": "raid_bdev1",
00:26:45.259 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592",
00:26:45.259 "strip_size_kb": 0,
00:26:45.259 "state": "online",
00:26:45.259 "raid_level": "raid1",
00:26:45.259 "superblock": true,
00:26:45.259 "num_base_bdevs": 4,
00:26:45.259 "num_base_bdevs_discovered": 3,
00:26:45.259 "num_base_bdevs_operational": 3,
00:26:45.259 "process": {
00:26:45.259 "type": "rebuild",
00:26:45.259 "target": "spare",
00:26:45.259 "progress": {
00:26:45.259 "blocks": 51200,
00:26:45.259 "percent": 80
00:26:45.259 }
00:26:45.259 },
00:26:45.259 "base_bdevs_list": [
00:26:45.259 {
00:26:45.259 "name": "spare",
00:26:45.259 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5",
00:26:45.259 "is_configured": true,
00:26:45.259 "data_offset": 2048,
00:26:45.259 "data_size": 63488
00:26:45.259 },
00:26:45.259 {
00:26:45.259 "name": null,
00:26:45.259 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:45.259 "is_configured": false,
00:26:45.259 "data_offset": 2048,
00:26:45.259 "data_size": 63488
00:26:45.259 },
00:26:45.259 {
00:26:45.259 "name": "BaseBdev3",
00:26:45.259 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef",
00:26:45.259 "is_configured": true,
00:26:45.259 "data_offset": 2048,
00:26:45.259 "data_size": 63488
00:26:45.259 },
00:26:45.259 {
00:26:45.259 "name": "BaseBdev4",
00:26:45.259 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb",
00:26:45.259 "is_configured": true,
00:26:45.259 "data_offset": 2048,
00:26:45.259 "data_size": 63488
00:26:45.259 }
00:26:45.259 ]
00:26:45.259 }'
00:26:45.259 10:34:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:45.259 [2024-07-15 10:34:22.249135] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296
00:26:45.259 10:34:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:26:45.259 10:34:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:45.259 10:34:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:26:45.259 10:34:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1
00:26:45.516 [2024-07-15 10:34:22.583139] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440
00:26:46.080 [2024-07-15 10:34:23.041049] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:26:46.080 [2024-07-15 10:34:23.150479] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:26:46.080 [2024-07-15 10:34:23.152813] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:46.335 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:46.592 "name": "raid_bdev1",
00:26:46.592 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592",
00:26:46.592 "strip_size_kb": 0,
00:26:46.592 "state": "online",
00:26:46.592 "raid_level": "raid1",
00:26:46.592 "superblock": true,
00:26:46.592 "num_base_bdevs": 4,
00:26:46.592 "num_base_bdevs_discovered": 3,
00:26:46.592 "num_base_bdevs_operational": 3,
00:26:46.592 "base_bdevs_list": [
00:26:46.592 {
00:26:46.592 "name": "spare",
00:26:46.592 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5",
00:26:46.592 "is_configured": true,
00:26:46.592 "data_offset": 2048,
00:26:46.592 "data_size": 63488
00:26:46.592 },
00:26:46.592 {
00:26:46.592 "name": null,
00:26:46.592 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:46.592 "is_configured": false,
00:26:46.592 "data_offset": 2048,
00:26:46.592 "data_size": 63488
00:26:46.592 },
00:26:46.592 {
00:26:46.592 "name": "BaseBdev3",
00:26:46.592 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef",
00:26:46.592 "is_configured": true,
00:26:46.592 "data_offset": 2048,
00:26:46.592 "data_size": 63488
00:26:46.592 },
00:26:46.592 {
00:26:46.592 "name": "BaseBdev4",
00:26:46.592 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb",
00:26:46.592 "is_configured": true,
00:26:46.592 "data_offset": 2048,
00:26:46.592 "data_size": 63488
00:26:46.592 }
00:26:46.592 ]
00:26:46.592 }'
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:46.592 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:46.849 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:26:46.849 "name": "raid_bdev1",
00:26:46.849 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592",
00:26:46.849 "strip_size_kb": 0,
00:26:46.849 "state": "online",
00:26:46.849 "raid_level": "raid1",
00:26:46.849 "superblock": true,
00:26:46.849 "num_base_bdevs": 4,
00:26:46.850 "num_base_bdevs_discovered": 3,
00:26:46.850 "num_base_bdevs_operational": 3,
00:26:46.850 "base_bdevs_list": [
00:26:46.850 {
00:26:46.850 "name": "spare",
00:26:46.850 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5",
00:26:46.850 "is_configured": true,
00:26:46.850 "data_offset": 2048,
00:26:46.850 "data_size": 63488
00:26:46.850 },
00:26:46.850 {
00:26:46.850 "name": null,
00:26:46.850 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:46.850 "is_configured": false,
00:26:46.850 "data_offset": 2048,
00:26:46.850 "data_size": 63488
00:26:46.850 },
00:26:46.850 {
00:26:46.850 "name": "BaseBdev3",
00:26:46.850 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef",
00:26:46.850 "is_configured": true,
00:26:46.850 "data_offset": 2048,
00:26:46.850 "data_size": 63488
00:26:46.850 },
00:26:46.850 {
00:26:46.850 "name": "BaseBdev4",
00:26:46.850 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb",
00:26:46.850 "is_configured": true,
00:26:46.850 "data_offset": 2048,
00:26:46.850 "data_size": 63488
00:26:46.850 }
00:26:46.850 ]
00:26:46.850 }'
00:26:46.850 10:34:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:46.850 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:47.108 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:47.108 "name": "raid_bdev1",
00:26:47.108 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592",
00:26:47.108 "strip_size_kb": 0,
00:26:47.108 "state": "online",
00:26:47.108 "raid_level": "raid1",
00:26:47.108 "superblock": true,
00:26:47.108 "num_base_bdevs": 4,
00:26:47.108 "num_base_bdevs_discovered": 3,
00:26:47.108 "num_base_bdevs_operational": 3,
00:26:47.108 "base_bdevs_list": [
00:26:47.108 {
00:26:47.108 "name": "spare",
00:26:47.108 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5",
00:26:47.108 "is_configured": true,
00:26:47.108 "data_offset": 2048,
00:26:47.108 "data_size": 63488
00:26:47.108 },
00:26:47.108 {
00:26:47.108 "name": null,
00:26:47.108 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:47.108 "is_configured": false,
00:26:47.108 "data_offset": 2048,
00:26:47.108 "data_size": 63488
00:26:47.108 },
00:26:47.108 {
00:26:47.108 "name": "BaseBdev3",
00:26:47.108 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef",
00:26:47.108 "is_configured": true,
00:26:47.108 "data_offset": 2048,
00:26:47.108 "data_size": 63488
00:26:47.108 },
00:26:47.108 {
00:26:47.108 "name": "BaseBdev4",
00:26:47.108 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb",
00:26:47.108 "is_configured": true,
00:26:47.108 "data_offset": 2048,
00:26:47.108 "data_size": 63488
00:26:47.108 }
00:26:47.108 ]
00:26:47.108 }'
00:26:47.108 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:47.108 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x
00:26:47.674 10:34:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:26:47.932 [2024-07-15 10:34:25.061236] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:26:47.932 [2024-07-15 10:34:25.061275] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:26:48.189
00:26:48.189 Latency(us)
00:26:48.190 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:26:48.190 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:26:48.190 raid_bdev1 : 11.55 85.11 255.33 0.00 0.00 15939.43 299.19 113519.75
00:26:48.190 ===================================================================================================================
00:26:48.190 Total : 85.11 255.33 0.00 0.00 15939.43 299.19 113519.75
00:26:48.190 [2024-07-15 10:34:25.153448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:48.190 [2024-07-15 10:34:25.153482] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:26:48.190 [2024-07-15 10:34:25.153574] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:26:48.190 [2024-07-15 10:34:25.153594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d248a0 name raid_bdev1, state offline
00:26:48.190 0
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]]
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']'
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']'
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare')
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:26:48.190 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0
00:26:48.754 /dev/nbd0
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:26:48.754 1+0 records in
00:26:48.754 1+0 records out
00:26:48.754 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257737 s, 15.9 MB/s
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}"
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']'
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}"
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']'
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3')
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1')
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:26:48.754 10:34:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1
00:26:49.010 /dev/nbd1
00:26:49.010 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:26:49.010 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:26:49.010 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:26:49.010 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i
00:26:49.010 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:26:49.010 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:26:49.011 1+0 records in
00:26:49.011 1+0 records out
00:26:49.011 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294662 s, 13.9 MB/s
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1')
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:26:49.011 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}"
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']'
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4')
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1')
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:26:49.266 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1
00:26:49.523 /dev/nbd1
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break
00:26:49.523 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:26:49.524 1+0 records in
00:26:49.524 1+0 records out
00:26:49.524 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279908 s, 14.6 MB/s
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io
-- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:49.524 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:49.801 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:49.801 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:49.801 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:49.801 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:49.801 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:49.801 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:49.801 10:34:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 
00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:50.067 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:50.324 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:50.580 [2024-07-15 10:34:27.704535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:50.580 [2024-07-15 10:34:27.704586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:50.580 [2024-07-15 10:34:27.704608] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d26380 00:26:50.580 [2024-07-15 10:34:27.704621] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:50.580 [2024-07-15 10:34:27.706279] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:50.580 [2024-07-15 10:34:27.706309] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:50.580 [2024-07-15 10:34:27.706402] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:50.580 [2024-07-15 10:34:27.706432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:50.580 [2024-07-15 10:34:27.706541] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:50.580 [2024-07-15 10:34:27.706616] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:50.580 spare 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.580 10:34:27 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.580 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.849 [2024-07-15 10:34:27.806943] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bf2c80 00:26:50.849 [2024-07-15 10:34:27.806969] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:50.849 [2024-07-15 10:34:27.807186] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18fafa0 00:26:50.849 [2024-07-15 10:34:27.807357] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bf2c80 00:26:50.849 [2024-07-15 10:34:27.807368] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bf2c80 00:26:50.849 [2024-07-15 10:34:27.807482] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:50.849 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.849 "name": "raid_bdev1", 00:26:50.849 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:50.849 "strip_size_kb": 0, 00:26:50.849 "state": "online", 00:26:50.849 "raid_level": "raid1", 00:26:50.849 "superblock": true, 00:26:50.849 "num_base_bdevs": 4, 00:26:50.849 
"num_base_bdevs_discovered": 3, 00:26:50.849 "num_base_bdevs_operational": 3, 00:26:50.849 "base_bdevs_list": [ 00:26:50.849 { 00:26:50.849 "name": "spare", 00:26:50.849 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5", 00:26:50.849 "is_configured": true, 00:26:50.849 "data_offset": 2048, 00:26:50.849 "data_size": 63488 00:26:50.849 }, 00:26:50.849 { 00:26:50.849 "name": null, 00:26:50.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.849 "is_configured": false, 00:26:50.849 "data_offset": 2048, 00:26:50.849 "data_size": 63488 00:26:50.849 }, 00:26:50.849 { 00:26:50.849 "name": "BaseBdev3", 00:26:50.849 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:50.849 "is_configured": true, 00:26:50.849 "data_offset": 2048, 00:26:50.849 "data_size": 63488 00:26:50.849 }, 00:26:50.849 { 00:26:50.849 "name": "BaseBdev4", 00:26:50.849 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:50.849 "is_configured": true, 00:26:50.849 "data_offset": 2048, 00:26:50.849 "data_size": 63488 00:26:50.849 } 00:26:50.849 ] 00:26:50.849 }' 00:26:50.849 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.849 10:34:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:51.412 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:51.412 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.412 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:51.412 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:51.412 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.412 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:51.412 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.668 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.668 "name": "raid_bdev1", 00:26:51.668 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:51.668 "strip_size_kb": 0, 00:26:51.668 "state": "online", 00:26:51.668 "raid_level": "raid1", 00:26:51.668 "superblock": true, 00:26:51.668 "num_base_bdevs": 4, 00:26:51.668 "num_base_bdevs_discovered": 3, 00:26:51.668 "num_base_bdevs_operational": 3, 00:26:51.668 "base_bdevs_list": [ 00:26:51.668 { 00:26:51.668 "name": "spare", 00:26:51.668 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5", 00:26:51.668 "is_configured": true, 00:26:51.668 "data_offset": 2048, 00:26:51.668 "data_size": 63488 00:26:51.668 }, 00:26:51.668 { 00:26:51.668 "name": null, 00:26:51.668 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.668 "is_configured": false, 00:26:51.668 "data_offset": 2048, 00:26:51.668 "data_size": 63488 00:26:51.668 }, 00:26:51.668 { 00:26:51.669 "name": "BaseBdev3", 00:26:51.669 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:51.669 "is_configured": true, 00:26:51.669 "data_offset": 2048, 00:26:51.669 "data_size": 63488 00:26:51.669 }, 00:26:51.669 { 00:26:51.669 "name": "BaseBdev4", 00:26:51.669 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:51.669 "is_configured": true, 00:26:51.669 "data_offset": 2048, 00:26:51.669 "data_size": 63488 00:26:51.669 } 00:26:51.669 ] 00:26:51.669 }' 00:26:51.669 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.669 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:51.669 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.925 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:51.925 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.925 10:34:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:52.181 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:52.181 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:52.181 [2024-07-15 10:34:29.369277] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:52.437 "name": "raid_bdev1", 00:26:52.437 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:52.437 "strip_size_kb": 0, 00:26:52.437 "state": "online", 00:26:52.437 "raid_level": "raid1", 00:26:52.437 "superblock": true, 00:26:52.437 "num_base_bdevs": 4, 00:26:52.437 "num_base_bdevs_discovered": 2, 00:26:52.437 "num_base_bdevs_operational": 2, 00:26:52.437 "base_bdevs_list": [ 00:26:52.437 { 00:26:52.437 "name": null, 00:26:52.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.437 "is_configured": false, 00:26:52.437 "data_offset": 2048, 00:26:52.437 "data_size": 63488 00:26:52.437 }, 00:26:52.437 { 00:26:52.437 "name": null, 00:26:52.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:52.437 "is_configured": false, 00:26:52.437 "data_offset": 2048, 00:26:52.437 "data_size": 63488 00:26:52.437 }, 00:26:52.437 { 00:26:52.437 "name": "BaseBdev3", 00:26:52.437 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:52.437 "is_configured": true, 00:26:52.437 "data_offset": 2048, 00:26:52.437 "data_size": 63488 00:26:52.437 }, 00:26:52.437 { 00:26:52.437 "name": "BaseBdev4", 00:26:52.437 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:52.437 "is_configured": true, 00:26:52.437 "data_offset": 2048, 00:26:52.437 "data_size": 63488 00:26:52.437 } 00:26:52.437 ] 00:26:52.437 }' 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:52.437 10:34:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:53.365 10:34:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:53.365 [2024-07-15 10:34:30.444302] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:53.365 [2024-07-15 10:34:30.444470] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:53.365 [2024-07-15 10:34:30.444486] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:53.365 [2024-07-15 10:34:30.444516] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:53.365 [2024-07-15 10:34:30.449021] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d99b30 00:26:53.365 [2024-07-15 10:34:30.451285] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:53.365 10:34:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:54.293 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:54.293 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:54.293 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:54.293 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:54.293 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:54.293 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.293 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.550 10:34:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:54.550 "name": "raid_bdev1", 00:26:54.550 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:54.550 "strip_size_kb": 0, 00:26:54.550 "state": "online", 00:26:54.550 "raid_level": "raid1", 00:26:54.550 "superblock": true, 00:26:54.550 "num_base_bdevs": 4, 00:26:54.550 "num_base_bdevs_discovered": 3, 00:26:54.550 "num_base_bdevs_operational": 3, 00:26:54.550 "process": { 00:26:54.550 "type": "rebuild", 00:26:54.550 "target": "spare", 00:26:54.550 "progress": { 00:26:54.550 "blocks": 22528, 00:26:54.550 "percent": 35 00:26:54.550 } 00:26:54.550 }, 00:26:54.550 "base_bdevs_list": [ 00:26:54.550 { 00:26:54.550 "name": "spare", 00:26:54.550 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5", 00:26:54.550 "is_configured": true, 00:26:54.550 "data_offset": 2048, 00:26:54.550 "data_size": 63488 00:26:54.550 }, 00:26:54.550 { 00:26:54.550 "name": null, 00:26:54.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.550 "is_configured": false, 00:26:54.550 "data_offset": 2048, 00:26:54.550 "data_size": 63488 00:26:54.550 }, 00:26:54.550 { 00:26:54.550 "name": "BaseBdev3", 00:26:54.550 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:54.550 "is_configured": true, 00:26:54.550 "data_offset": 2048, 00:26:54.550 "data_size": 63488 00:26:54.550 }, 00:26:54.550 { 00:26:54.550 "name": "BaseBdev4", 00:26:54.550 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:54.550 "is_configured": true, 00:26:54.550 "data_offset": 2048, 00:26:54.550 "data_size": 63488 00:26:54.550 } 00:26:54.550 ] 00:26:54.550 }' 00:26:54.550 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:54.550 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:54.550 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:54.550 10:34:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:54.550 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:54.807 [2024-07-15 10:34:31.961465] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:54.807 [2024-07-15 10:34:31.963191] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:54.807 [2024-07-15 10:34:31.963235] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:54.807 [2024-07-15 10:34:31.963252] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:54.807 [2024-07-15 10:34:31.963260] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:54.807 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:54.807 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:54.808 10:34:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.808 10:34:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:55.065 10:34:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.065 "name": "raid_bdev1", 00:26:55.065 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:55.065 "strip_size_kb": 0, 00:26:55.065 "state": "online", 00:26:55.065 "raid_level": "raid1", 00:26:55.065 "superblock": true, 00:26:55.065 "num_base_bdevs": 4, 00:26:55.065 "num_base_bdevs_discovered": 2, 00:26:55.065 "num_base_bdevs_operational": 2, 00:26:55.065 "base_bdevs_list": [ 00:26:55.065 { 00:26:55.065 "name": null, 00:26:55.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.065 "is_configured": false, 00:26:55.065 "data_offset": 2048, 00:26:55.065 "data_size": 63488 00:26:55.065 }, 00:26:55.065 { 00:26:55.065 "name": null, 00:26:55.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.065 "is_configured": false, 00:26:55.065 "data_offset": 2048, 00:26:55.065 "data_size": 63488 00:26:55.065 }, 00:26:55.065 { 00:26:55.065 "name": "BaseBdev3", 00:26:55.065 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:55.065 "is_configured": true, 00:26:55.065 "data_offset": 2048, 00:26:55.065 "data_size": 63488 00:26:55.065 }, 00:26:55.065 { 00:26:55.065 "name": "BaseBdev4", 00:26:55.065 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:55.065 "is_configured": true, 00:26:55.065 "data_offset": 2048, 00:26:55.065 "data_size": 63488 00:26:55.065 } 00:26:55.065 ] 00:26:55.065 }' 00:26:55.065 10:34:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.065 10:34:32 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@10 -- # set +x 00:26:55.994 10:34:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:55.994 [2024-07-15 10:34:33.074580] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:55.994 [2024-07-15 10:34:33.074639] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:55.994 [2024-07-15 10:34:33.074661] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d281b0 00:26:55.994 [2024-07-15 10:34:33.074674] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:55.994 [2024-07-15 10:34:33.075070] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:55.994 [2024-07-15 10:34:33.075090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:55.994 [2024-07-15 10:34:33.075174] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:55.994 [2024-07-15 10:34:33.075187] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:26:55.994 [2024-07-15 10:34:33.075198] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:55.994 [2024-07-15 10:34:33.075217] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:55.994 [2024-07-15 10:34:33.079685] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18cdfc0 00:26:55.994 spare 00:26:55.994 [2024-07-15 10:34:33.081084] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:55.994 10:34:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:56.923 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:56.923 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:56.923 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:56.923 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:56.923 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:56.923 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.923 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.179 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.179 "name": "raid_bdev1", 00:26:57.179 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:57.179 "strip_size_kb": 0, 00:26:57.179 "state": "online", 00:26:57.179 "raid_level": "raid1", 00:26:57.179 "superblock": true, 00:26:57.179 "num_base_bdevs": 4, 00:26:57.179 "num_base_bdevs_discovered": 3, 00:26:57.179 "num_base_bdevs_operational": 3, 00:26:57.179 "process": { 00:26:57.179 "type": "rebuild", 00:26:57.179 "target": "spare", 00:26:57.179 "progress": { 00:26:57.179 
"blocks": 24576, 00:26:57.179 "percent": 38 00:26:57.179 } 00:26:57.179 }, 00:26:57.179 "base_bdevs_list": [ 00:26:57.179 { 00:26:57.179 "name": "spare", 00:26:57.179 "uuid": "b7da9713-d4dc-5ace-97c9-9b871a5477f5", 00:26:57.179 "is_configured": true, 00:26:57.179 "data_offset": 2048, 00:26:57.179 "data_size": 63488 00:26:57.179 }, 00:26:57.179 { 00:26:57.179 "name": null, 00:26:57.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.179 "is_configured": false, 00:26:57.179 "data_offset": 2048, 00:26:57.179 "data_size": 63488 00:26:57.179 }, 00:26:57.179 { 00:26:57.179 "name": "BaseBdev3", 00:26:57.179 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:57.179 "is_configured": true, 00:26:57.179 "data_offset": 2048, 00:26:57.179 "data_size": 63488 00:26:57.179 }, 00:26:57.179 { 00:26:57.179 "name": "BaseBdev4", 00:26:57.179 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:57.179 "is_configured": true, 00:26:57.179 "data_offset": 2048, 00:26:57.179 "data_size": 63488 00:26:57.179 } 00:26:57.179 ] 00:26:57.179 }' 00:26:57.179 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.436 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:57.436 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.436 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:57.436 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:57.694 [2024-07-15 10:34:34.672529] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.694 [2024-07-15 10:34:34.693519] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:57.694 [2024-07-15 
10:34:34.693565] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.694 [2024-07-15 10:34:34.693582] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.694 [2024-07-15 10:34:34.693590] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.694 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.951 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.951 "name": "raid_bdev1", 00:26:57.951 "uuid": 
"72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:57.951 "strip_size_kb": 0, 00:26:57.951 "state": "online", 00:26:57.951 "raid_level": "raid1", 00:26:57.951 "superblock": true, 00:26:57.951 "num_base_bdevs": 4, 00:26:57.951 "num_base_bdevs_discovered": 2, 00:26:57.951 "num_base_bdevs_operational": 2, 00:26:57.951 "base_bdevs_list": [ 00:26:57.951 { 00:26:57.951 "name": null, 00:26:57.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.952 "is_configured": false, 00:26:57.952 "data_offset": 2048, 00:26:57.952 "data_size": 63488 00:26:57.952 }, 00:26:57.952 { 00:26:57.952 "name": null, 00:26:57.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.952 "is_configured": false, 00:26:57.952 "data_offset": 2048, 00:26:57.952 "data_size": 63488 00:26:57.952 }, 00:26:57.952 { 00:26:57.952 "name": "BaseBdev3", 00:26:57.952 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:57.952 "is_configured": true, 00:26:57.952 "data_offset": 2048, 00:26:57.952 "data_size": 63488 00:26:57.952 }, 00:26:57.952 { 00:26:57.952 "name": "BaseBdev4", 00:26:57.952 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:57.952 "is_configured": true, 00:26:57.952 "data_offset": 2048, 00:26:57.952 "data_size": 63488 00:26:57.952 } 00:26:57.952 ] 00:26:57.952 }' 00:26:57.952 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.952 10:34:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:58.518 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:58.518 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:58.518 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:58.518 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:58.518 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:58.518 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.518 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.776 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:58.776 "name": "raid_bdev1", 00:26:58.776 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:26:58.776 "strip_size_kb": 0, 00:26:58.776 "state": "online", 00:26:58.776 "raid_level": "raid1", 00:26:58.776 "superblock": true, 00:26:58.776 "num_base_bdevs": 4, 00:26:58.776 "num_base_bdevs_discovered": 2, 00:26:58.776 "num_base_bdevs_operational": 2, 00:26:58.776 "base_bdevs_list": [ 00:26:58.776 { 00:26:58.776 "name": null, 00:26:58.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.776 "is_configured": false, 00:26:58.776 "data_offset": 2048, 00:26:58.776 "data_size": 63488 00:26:58.776 }, 00:26:58.776 { 00:26:58.776 "name": null, 00:26:58.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.776 "is_configured": false, 00:26:58.776 "data_offset": 2048, 00:26:58.776 "data_size": 63488 00:26:58.776 }, 00:26:58.776 { 00:26:58.776 "name": "BaseBdev3", 00:26:58.776 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:26:58.776 "is_configured": true, 00:26:58.776 "data_offset": 2048, 00:26:58.776 "data_size": 63488 00:26:58.776 }, 00:26:58.776 { 00:26:58.776 "name": "BaseBdev4", 00:26:58.776 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:26:58.776 "is_configured": true, 00:26:58.776 "data_offset": 2048, 00:26:58.776 "data_size": 63488 00:26:58.776 } 00:26:58.776 ] 00:26:58.776 }' 00:26:58.776 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:58.776 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 
-- # [[ none == \n\o\n\e ]] 00:26:58.776 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:58.776 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:58.776 10:34:35 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:59.034 10:34:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:59.293 [2024-07-15 10:34:36.314995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:59.293 [2024-07-15 10:34:36.315051] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:59.293 [2024-07-15 10:34:36.315073] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d27d90 00:26:59.293 [2024-07-15 10:34:36.315086] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:59.293 [2024-07-15 10:34:36.315440] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:59.293 [2024-07-15 10:34:36.315459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:59.293 [2024-07-15 10:34:36.315526] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:59.293 [2024-07-15 10:34:36.315538] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:59.293 [2024-07-15 10:34:36.315550] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:59.293 BaseBdev1 00:26:59.293 10:34:36 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # 
sleep 1 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.226 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:00.484 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:00.484 "name": "raid_bdev1", 00:27:00.484 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:27:00.484 "strip_size_kb": 0, 00:27:00.484 "state": "online", 00:27:00.484 "raid_level": "raid1", 00:27:00.484 "superblock": true, 00:27:00.484 "num_base_bdevs": 4, 00:27:00.484 "num_base_bdevs_discovered": 2, 00:27:00.484 "num_base_bdevs_operational": 2, 00:27:00.484 "base_bdevs_list": [ 00:27:00.484 { 00:27:00.484 "name": 
null, 00:27:00.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.484 "is_configured": false, 00:27:00.484 "data_offset": 2048, 00:27:00.484 "data_size": 63488 00:27:00.484 }, 00:27:00.484 { 00:27:00.484 "name": null, 00:27:00.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.484 "is_configured": false, 00:27:00.484 "data_offset": 2048, 00:27:00.484 "data_size": 63488 00:27:00.484 }, 00:27:00.484 { 00:27:00.484 "name": "BaseBdev3", 00:27:00.484 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:27:00.484 "is_configured": true, 00:27:00.484 "data_offset": 2048, 00:27:00.484 "data_size": 63488 00:27:00.484 }, 00:27:00.484 { 00:27:00.484 "name": "BaseBdev4", 00:27:00.484 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:27:00.484 "is_configured": true, 00:27:00.484 "data_offset": 2048, 00:27:00.484 "data_size": 63488 00:27:00.484 } 00:27:00.484 ] 00:27:00.484 }' 00:27:00.484 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:00.484 10:34:37 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:01.049 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:01.049 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:01.049 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:01.049 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:01.049 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:01.049 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.049 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:01.307 "name": "raid_bdev1", 00:27:01.307 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:27:01.307 "strip_size_kb": 0, 00:27:01.307 "state": "online", 00:27:01.307 "raid_level": "raid1", 00:27:01.307 "superblock": true, 00:27:01.307 "num_base_bdevs": 4, 00:27:01.307 "num_base_bdevs_discovered": 2, 00:27:01.307 "num_base_bdevs_operational": 2, 00:27:01.307 "base_bdevs_list": [ 00:27:01.307 { 00:27:01.307 "name": null, 00:27:01.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.307 "is_configured": false, 00:27:01.307 "data_offset": 2048, 00:27:01.307 "data_size": 63488 00:27:01.307 }, 00:27:01.307 { 00:27:01.307 "name": null, 00:27:01.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.307 "is_configured": false, 00:27:01.307 "data_offset": 2048, 00:27:01.307 "data_size": 63488 00:27:01.307 }, 00:27:01.307 { 00:27:01.307 "name": "BaseBdev3", 00:27:01.307 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:27:01.307 "is_configured": true, 00:27:01.307 "data_offset": 2048, 00:27:01.307 "data_size": 63488 00:27:01.307 }, 00:27:01.307 { 00:27:01.307 "name": "BaseBdev4", 00:27:01.307 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:27:01.307 "is_configured": true, 00:27:01.307 "data_offset": 2048, 00:27:01.307 "data_size": 63488 00:27:01.307 } 00:27:01.307 ] 00:27:01.307 }' 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:01.307 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:01.565 [2024-07-15 10:34:38.702076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:27:01.565 [2024-07-15 10:34:38.702221] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:01.565 [2024-07-15 10:34:38.702236] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:01.565 request: 00:27:01.565 { 00:27:01.565 "base_bdev": "BaseBdev1", 00:27:01.565 "raid_bdev": "raid_bdev1", 00:27:01.565 "method": "bdev_raid_add_base_bdev", 00:27:01.565 "req_id": 1 00:27:01.565 } 00:27:01.565 Got JSON-RPC error response 00:27:01.565 response: 00:27:01.565 { 00:27:01.565 "code": -22, 00:27:01.565 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:01.565 } 00:27:01.565 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:27:01.565 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:01.565 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:01.565 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:01.565 10:34:38 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 
00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:02.938 "name": "raid_bdev1", 00:27:02.938 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:27:02.938 "strip_size_kb": 0, 00:27:02.938 "state": "online", 00:27:02.938 "raid_level": "raid1", 00:27:02.938 "superblock": true, 00:27:02.938 "num_base_bdevs": 4, 00:27:02.938 "num_base_bdevs_discovered": 2, 00:27:02.938 "num_base_bdevs_operational": 2, 00:27:02.938 "base_bdevs_list": [ 00:27:02.938 { 00:27:02.938 "name": null, 00:27:02.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:02.938 "is_configured": false, 00:27:02.938 "data_offset": 2048, 00:27:02.938 "data_size": 63488 00:27:02.938 }, 00:27:02.938 { 00:27:02.938 "name": null, 00:27:02.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:02.938 "is_configured": false, 00:27:02.938 "data_offset": 2048, 00:27:02.938 "data_size": 63488 00:27:02.938 }, 00:27:02.938 { 00:27:02.938 "name": "BaseBdev3", 00:27:02.938 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:27:02.938 "is_configured": true, 00:27:02.938 "data_offset": 2048, 00:27:02.938 "data_size": 63488 00:27:02.938 }, 00:27:02.938 { 00:27:02.938 "name": "BaseBdev4", 00:27:02.938 "uuid": 
"953fee74-6078-5876-9f79-8fb93d00a1eb", 00:27:02.938 "is_configured": true, 00:27:02.938 "data_offset": 2048, 00:27:02.938 "data_size": 63488 00:27:02.938 } 00:27:02.938 ] 00:27:02.938 }' 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:02.938 10:34:39 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:03.504 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:03.504 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.504 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:03.504 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:03.504 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:03.504 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.504 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:03.762 "name": "raid_bdev1", 00:27:03.762 "uuid": "72a6e8bb-5e68-404a-8601-877dd32e1592", 00:27:03.762 "strip_size_kb": 0, 00:27:03.762 "state": "online", 00:27:03.762 "raid_level": "raid1", 00:27:03.762 "superblock": true, 00:27:03.762 "num_base_bdevs": 4, 00:27:03.762 "num_base_bdevs_discovered": 2, 00:27:03.762 "num_base_bdevs_operational": 2, 00:27:03.762 "base_bdevs_list": [ 00:27:03.762 { 00:27:03.762 "name": null, 00:27:03.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.762 "is_configured": false, 00:27:03.762 "data_offset": 2048, 00:27:03.762 "data_size": 63488 
00:27:03.762 }, 00:27:03.762 { 00:27:03.762 "name": null, 00:27:03.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.762 "is_configured": false, 00:27:03.762 "data_offset": 2048, 00:27:03.762 "data_size": 63488 00:27:03.762 }, 00:27:03.762 { 00:27:03.762 "name": "BaseBdev3", 00:27:03.762 "uuid": "17877661-d21b-54ed-8b6f-52bd128ebaef", 00:27:03.762 "is_configured": true, 00:27:03.762 "data_offset": 2048, 00:27:03.762 "data_size": 63488 00:27:03.762 }, 00:27:03.762 { 00:27:03.762 "name": "BaseBdev4", 00:27:03.762 "uuid": "953fee74-6078-5876-9f79-8fb93d00a1eb", 00:27:03.762 "is_configured": true, 00:27:03.762 "data_offset": 2048, 00:27:03.762 "data_size": 63488 00:27:03.762 } 00:27:03.762 ] 00:27:03.762 }' 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 607697 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 607697 ']' 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 607697 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:03.762 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 607697 00:27:04.020 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:04.020 10:34:40 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:04.020 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 607697' 00:27:04.020 killing process with pid 607697 00:27:04.020 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 607697 00:27:04.020 Received shutdown signal, test time was about 27.348051 seconds 00:27:04.020 00:27:04.020 Latency(us) 00:27:04.020 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:04.020 =================================================================================================================== 00:27:04.020 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:04.020 [2024-07-15 10:34:40.986038] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:04.020 [2024-07-15 10:34:40.986152] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:04.020 10:34:40 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 607697 00:27:04.020 [2024-07-15 10:34:40.986215] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:04.020 [2024-07-15 10:34:40.986228] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bf2c80 name raid_bdev1, state offline 00:27:04.020 [2024-07-15 10:34:41.034279] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:04.277 10:34:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:27:04.277 00:27:04.277 real 0m32.619s 00:27:04.277 user 0m51.317s 00:27:04.277 sys 0m5.094s 00:27:04.277 10:34:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:04.277 10:34:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:04.277 ************************************ 00:27:04.277 END TEST raid_rebuild_test_sb_io 
00:27:04.277 ************************************ 00:27:04.277 10:34:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:04.277 10:34:41 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:27:04.277 10:34:41 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:27:04.277 10:34:41 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:27:04.277 10:34:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:04.277 10:34:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:04.277 10:34:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:04.277 ************************************ 00:27:04.277 START TEST raid_state_function_test_sb_4k 00:27:04.278 ************************************ 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:04.278 
10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=612376 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 612376' 00:27:04.278 Process raid pid: 612376 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:04.278 10:34:41 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 612376 /var/tmp/spdk-raid.sock 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 612376 ']' 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:04.278 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:04.278 10:34:41 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:04.278 [2024-07-15 10:34:41.420652] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:27:04.278 [2024-07-15 10:34:41.420721] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:04.535 [2024-07-15 10:34:41.540179] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:04.535 [2024-07-15 10:34:41.646601] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:04.535 [2024-07-15 10:34:41.714378] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:04.535 [2024-07-15 10:34:41.714413] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:05.494 [2024-07-15 10:34:42.574192] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:05.494 [2024-07-15 10:34:42.574237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:05.494 [2024-07-15 10:34:42.574248] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:05.494 [2024-07-15 10:34:42.574260] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.494 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:05.752 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:05.752 "name": "Existed_Raid", 00:27:05.752 "uuid": "59b88ea1-a6ce-42fa-8d80-dce492406d6a", 00:27:05.752 "strip_size_kb": 0, 00:27:05.752 "state": "configuring", 00:27:05.752 "raid_level": "raid1", 00:27:05.752 "superblock": true, 00:27:05.752 "num_base_bdevs": 2, 00:27:05.752 "num_base_bdevs_discovered": 0, 00:27:05.752 "num_base_bdevs_operational": 2, 00:27:05.752 "base_bdevs_list": [ 00:27:05.752 { 00:27:05.752 "name": "BaseBdev1", 00:27:05.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.752 "is_configured": false, 00:27:05.752 "data_offset": 0, 00:27:05.752 "data_size": 0 
00:27:05.752 }, 00:27:05.752 { 00:27:05.752 "name": "BaseBdev2", 00:27:05.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.752 "is_configured": false, 00:27:05.752 "data_offset": 0, 00:27:05.752 "data_size": 0 00:27:05.752 } 00:27:05.752 ] 00:27:05.752 }' 00:27:05.752 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:05.752 10:34:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:06.315 10:34:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:06.572 [2024-07-15 10:34:43.660914] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:06.572 [2024-07-15 10:34:43.660955] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b8ba80 name Existed_Raid, state configuring 00:27:06.572 10:34:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:06.830 [2024-07-15 10:34:43.905597] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:06.830 [2024-07-15 10:34:43.905631] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:06.830 [2024-07-15 10:34:43.905642] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:06.830 [2024-07-15 10:34:43.905653] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:06.830 10:34:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:27:07.087 [2024-07-15 
10:34:44.164137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:07.087 BaseBdev1 00:27:07.087 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:07.087 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:07.087 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:07.087 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:07.087 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:07.087 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:07.087 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:07.345 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:07.603 [ 00:27:07.603 { 00:27:07.603 "name": "BaseBdev1", 00:27:07.603 "aliases": [ 00:27:07.603 "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd" 00:27:07.603 ], 00:27:07.603 "product_name": "Malloc disk", 00:27:07.603 "block_size": 4096, 00:27:07.603 "num_blocks": 8192, 00:27:07.603 "uuid": "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd", 00:27:07.603 "assigned_rate_limits": { 00:27:07.603 "rw_ios_per_sec": 0, 00:27:07.603 "rw_mbytes_per_sec": 0, 00:27:07.603 "r_mbytes_per_sec": 0, 00:27:07.603 "w_mbytes_per_sec": 0 00:27:07.603 }, 00:27:07.603 "claimed": true, 00:27:07.603 "claim_type": "exclusive_write", 00:27:07.603 "zoned": false, 00:27:07.603 "supported_io_types": { 00:27:07.603 "read": true, 00:27:07.603 "write": true, 
00:27:07.603 "unmap": true, 00:27:07.603 "flush": true, 00:27:07.603 "reset": true, 00:27:07.603 "nvme_admin": false, 00:27:07.603 "nvme_io": false, 00:27:07.603 "nvme_io_md": false, 00:27:07.603 "write_zeroes": true, 00:27:07.603 "zcopy": true, 00:27:07.603 "get_zone_info": false, 00:27:07.603 "zone_management": false, 00:27:07.603 "zone_append": false, 00:27:07.603 "compare": false, 00:27:07.603 "compare_and_write": false, 00:27:07.603 "abort": true, 00:27:07.603 "seek_hole": false, 00:27:07.603 "seek_data": false, 00:27:07.603 "copy": true, 00:27:07.603 "nvme_iov_md": false 00:27:07.603 }, 00:27:07.603 "memory_domains": [ 00:27:07.603 { 00:27:07.603 "dma_device_id": "system", 00:27:07.603 "dma_device_type": 1 00:27:07.603 }, 00:27:07.603 { 00:27:07.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:07.603 "dma_device_type": 2 00:27:07.603 } 00:27:07.603 ], 00:27:07.603 "driver_specific": {} 00:27:07.603 } 00:27:07.603 ] 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:07.603 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:07.604 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:07.604 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:07.604 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:07.604 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.604 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:07.862 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:07.862 "name": "Existed_Raid", 00:27:07.862 "uuid": "3e4881a9-2a25-43d0-96c6-1a83ddd4ab6b", 00:27:07.862 "strip_size_kb": 0, 00:27:07.862 "state": "configuring", 00:27:07.862 "raid_level": "raid1", 00:27:07.862 "superblock": true, 00:27:07.862 "num_base_bdevs": 2, 00:27:07.862 "num_base_bdevs_discovered": 1, 00:27:07.862 "num_base_bdevs_operational": 2, 00:27:07.862 "base_bdevs_list": [ 00:27:07.862 { 00:27:07.862 "name": "BaseBdev1", 00:27:07.862 "uuid": "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd", 00:27:07.862 "is_configured": true, 00:27:07.862 "data_offset": 256, 00:27:07.862 "data_size": 7936 00:27:07.862 }, 00:27:07.862 { 00:27:07.862 "name": "BaseBdev2", 00:27:07.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.862 "is_configured": false, 00:27:07.862 "data_offset": 0, 00:27:07.862 "data_size": 0 00:27:07.862 } 00:27:07.862 ] 00:27:07.862 }' 00:27:07.862 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:07.862 10:34:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:08.428 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:08.686 [2024-07-15 10:34:45.732286] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:08.686 [2024-07-15 10:34:45.732328] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b8b350 name Existed_Raid, state configuring 00:27:08.686 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:08.944 [2024-07-15 10:34:45.968959] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:08.944 [2024-07-15 10:34:45.970450] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:08.944 [2024-07-15 10:34:45.970481] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:08.944 10:34:45 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.944 10:34:45 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:09.201 10:34:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:09.201 "name": "Existed_Raid", 00:27:09.201 "uuid": "a9c69158-a8a6-47c0-acc9-7730c2e7f94e", 00:27:09.201 "strip_size_kb": 0, 00:27:09.201 "state": "configuring", 00:27:09.201 "raid_level": "raid1", 00:27:09.201 "superblock": true, 00:27:09.201 "num_base_bdevs": 2, 00:27:09.201 "num_base_bdevs_discovered": 1, 00:27:09.201 "num_base_bdevs_operational": 2, 00:27:09.201 "base_bdevs_list": [ 00:27:09.201 { 00:27:09.201 "name": "BaseBdev1", 00:27:09.201 "uuid": "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd", 00:27:09.201 "is_configured": true, 00:27:09.201 "data_offset": 256, 00:27:09.201 "data_size": 7936 00:27:09.201 }, 00:27:09.201 { 00:27:09.201 "name": "BaseBdev2", 00:27:09.201 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:09.201 "is_configured": false, 00:27:09.201 "data_offset": 0, 00:27:09.201 "data_size": 0 00:27:09.201 } 00:27:09.201 ] 00:27:09.201 }' 00:27:09.201 10:34:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:09.201 10:34:46 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:09.767 
10:34:46 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:27:10.025 [2024-07-15 10:34:47.108358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:10.025 [2024-07-15 10:34:47.108526] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b8c000 00:27:10.025 [2024-07-15 10:34:47.108540] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:10.025 [2024-07-15 10:34:47.108713] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1aa60c0 00:27:10.025 [2024-07-15 10:34:47.108836] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b8c000 00:27:10.025 [2024-07-15 10:34:47.108847] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b8c000 00:27:10.025 [2024-07-15 10:34:47.108954] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:10.025 BaseBdev2 00:27:10.025 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:10.025 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:10.025 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:10.025 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:27:10.025 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:10.025 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:10.025 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:10.283 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:10.541 [ 00:27:10.541 { 00:27:10.541 "name": "BaseBdev2", 00:27:10.541 "aliases": [ 00:27:10.541 "be970164-ee1a-47d3-8d58-617bf60f19c4" 00:27:10.541 ], 00:27:10.541 "product_name": "Malloc disk", 00:27:10.541 "block_size": 4096, 00:27:10.541 "num_blocks": 8192, 00:27:10.541 "uuid": "be970164-ee1a-47d3-8d58-617bf60f19c4", 00:27:10.541 "assigned_rate_limits": { 00:27:10.541 "rw_ios_per_sec": 0, 00:27:10.541 "rw_mbytes_per_sec": 0, 00:27:10.541 "r_mbytes_per_sec": 0, 00:27:10.541 "w_mbytes_per_sec": 0 00:27:10.541 }, 00:27:10.541 "claimed": true, 00:27:10.541 "claim_type": "exclusive_write", 00:27:10.541 "zoned": false, 00:27:10.541 "supported_io_types": { 00:27:10.541 "read": true, 00:27:10.541 "write": true, 00:27:10.541 "unmap": true, 00:27:10.541 "flush": true, 00:27:10.541 "reset": true, 00:27:10.541 "nvme_admin": false, 00:27:10.541 "nvme_io": false, 00:27:10.542 "nvme_io_md": false, 00:27:10.542 "write_zeroes": true, 00:27:10.542 "zcopy": true, 00:27:10.542 "get_zone_info": false, 00:27:10.542 "zone_management": false, 00:27:10.542 "zone_append": false, 00:27:10.542 "compare": false, 00:27:10.542 "compare_and_write": false, 00:27:10.542 "abort": true, 00:27:10.542 "seek_hole": false, 00:27:10.542 "seek_data": false, 00:27:10.542 "copy": true, 00:27:10.542 "nvme_iov_md": false 00:27:10.542 }, 00:27:10.542 "memory_domains": [ 00:27:10.542 { 00:27:10.542 "dma_device_id": "system", 00:27:10.542 "dma_device_type": 1 00:27:10.542 }, 00:27:10.542 { 00:27:10.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.542 "dma_device_type": 2 00:27:10.542 } 00:27:10.542 ], 00:27:10.542 "driver_specific": {} 00:27:10.542 } 00:27:10.542 ] 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@905 -- # return 0 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.542 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:10.800 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.800 "name": "Existed_Raid", 00:27:10.800 "uuid": 
"a9c69158-a8a6-47c0-acc9-7730c2e7f94e", 00:27:10.800 "strip_size_kb": 0, 00:27:10.800 "state": "online", 00:27:10.800 "raid_level": "raid1", 00:27:10.800 "superblock": true, 00:27:10.800 "num_base_bdevs": 2, 00:27:10.800 "num_base_bdevs_discovered": 2, 00:27:10.800 "num_base_bdevs_operational": 2, 00:27:10.800 "base_bdevs_list": [ 00:27:10.800 { 00:27:10.800 "name": "BaseBdev1", 00:27:10.800 "uuid": "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd", 00:27:10.800 "is_configured": true, 00:27:10.801 "data_offset": 256, 00:27:10.801 "data_size": 7936 00:27:10.801 }, 00:27:10.801 { 00:27:10.801 "name": "BaseBdev2", 00:27:10.801 "uuid": "be970164-ee1a-47d3-8d58-617bf60f19c4", 00:27:10.801 "is_configured": true, 00:27:10.801 "data_offset": 256, 00:27:10.801 "data_size": 7936 00:27:10.801 } 00:27:10.801 ] 00:27:10.801 }' 00:27:10.801 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.801 10:34:47 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:11.387 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:11.387 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:11.387 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:11.387 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:11.387 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:11.387 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:11.387 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:11.387 10:34:48 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:11.643 [2024-07-15 10:34:48.708992] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:11.643 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:11.643 "name": "Existed_Raid", 00:27:11.643 "aliases": [ 00:27:11.643 "a9c69158-a8a6-47c0-acc9-7730c2e7f94e" 00:27:11.643 ], 00:27:11.643 "product_name": "Raid Volume", 00:27:11.643 "block_size": 4096, 00:27:11.643 "num_blocks": 7936, 00:27:11.643 "uuid": "a9c69158-a8a6-47c0-acc9-7730c2e7f94e", 00:27:11.643 "assigned_rate_limits": { 00:27:11.643 "rw_ios_per_sec": 0, 00:27:11.643 "rw_mbytes_per_sec": 0, 00:27:11.643 "r_mbytes_per_sec": 0, 00:27:11.643 "w_mbytes_per_sec": 0 00:27:11.643 }, 00:27:11.643 "claimed": false, 00:27:11.643 "zoned": false, 00:27:11.643 "supported_io_types": { 00:27:11.643 "read": true, 00:27:11.643 "write": true, 00:27:11.643 "unmap": false, 00:27:11.643 "flush": false, 00:27:11.643 "reset": true, 00:27:11.643 "nvme_admin": false, 00:27:11.643 "nvme_io": false, 00:27:11.644 "nvme_io_md": false, 00:27:11.644 "write_zeroes": true, 00:27:11.644 "zcopy": false, 00:27:11.644 "get_zone_info": false, 00:27:11.644 "zone_management": false, 00:27:11.644 "zone_append": false, 00:27:11.644 "compare": false, 00:27:11.644 "compare_and_write": false, 00:27:11.644 "abort": false, 00:27:11.644 "seek_hole": false, 00:27:11.644 "seek_data": false, 00:27:11.644 "copy": false, 00:27:11.644 "nvme_iov_md": false 00:27:11.644 }, 00:27:11.644 "memory_domains": [ 00:27:11.644 { 00:27:11.644 "dma_device_id": "system", 00:27:11.644 "dma_device_type": 1 00:27:11.644 }, 00:27:11.644 { 00:27:11.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.644 "dma_device_type": 2 00:27:11.644 }, 00:27:11.644 { 00:27:11.644 "dma_device_id": "system", 00:27:11.644 "dma_device_type": 1 00:27:11.644 }, 00:27:11.644 { 00:27:11.644 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:27:11.644 "dma_device_type": 2 00:27:11.644 } 00:27:11.644 ], 00:27:11.644 "driver_specific": { 00:27:11.644 "raid": { 00:27:11.644 "uuid": "a9c69158-a8a6-47c0-acc9-7730c2e7f94e", 00:27:11.644 "strip_size_kb": 0, 00:27:11.644 "state": "online", 00:27:11.644 "raid_level": "raid1", 00:27:11.644 "superblock": true, 00:27:11.644 "num_base_bdevs": 2, 00:27:11.644 "num_base_bdevs_discovered": 2, 00:27:11.644 "num_base_bdevs_operational": 2, 00:27:11.644 "base_bdevs_list": [ 00:27:11.644 { 00:27:11.644 "name": "BaseBdev1", 00:27:11.644 "uuid": "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd", 00:27:11.644 "is_configured": true, 00:27:11.644 "data_offset": 256, 00:27:11.644 "data_size": 7936 00:27:11.644 }, 00:27:11.644 { 00:27:11.644 "name": "BaseBdev2", 00:27:11.644 "uuid": "be970164-ee1a-47d3-8d58-617bf60f19c4", 00:27:11.644 "is_configured": true, 00:27:11.644 "data_offset": 256, 00:27:11.644 "data_size": 7936 00:27:11.644 } 00:27:11.644 ] 00:27:11.644 } 00:27:11.644 } 00:27:11.644 }' 00:27:11.644 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:11.644 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:11.644 BaseBdev2' 00:27:11.644 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:11.644 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:11.644 10:34:48 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:11.900 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:11.900 "name": "BaseBdev1", 00:27:11.900 "aliases": [ 00:27:11.900 "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd" 
00:27:11.900 ], 00:27:11.900 "product_name": "Malloc disk", 00:27:11.900 "block_size": 4096, 00:27:11.900 "num_blocks": 8192, 00:27:11.900 "uuid": "1e5d178e-fbe2-4cfe-b45f-03ad786aafdd", 00:27:11.900 "assigned_rate_limits": { 00:27:11.900 "rw_ios_per_sec": 0, 00:27:11.900 "rw_mbytes_per_sec": 0, 00:27:11.900 "r_mbytes_per_sec": 0, 00:27:11.900 "w_mbytes_per_sec": 0 00:27:11.900 }, 00:27:11.900 "claimed": true, 00:27:11.900 "claim_type": "exclusive_write", 00:27:11.900 "zoned": false, 00:27:11.900 "supported_io_types": { 00:27:11.900 "read": true, 00:27:11.900 "write": true, 00:27:11.900 "unmap": true, 00:27:11.900 "flush": true, 00:27:11.900 "reset": true, 00:27:11.900 "nvme_admin": false, 00:27:11.900 "nvme_io": false, 00:27:11.900 "nvme_io_md": false, 00:27:11.900 "write_zeroes": true, 00:27:11.900 "zcopy": true, 00:27:11.900 "get_zone_info": false, 00:27:11.900 "zone_management": false, 00:27:11.900 "zone_append": false, 00:27:11.900 "compare": false, 00:27:11.900 "compare_and_write": false, 00:27:11.900 "abort": true, 00:27:11.900 "seek_hole": false, 00:27:11.900 "seek_data": false, 00:27:11.900 "copy": true, 00:27:11.900 "nvme_iov_md": false 00:27:11.900 }, 00:27:11.900 "memory_domains": [ 00:27:11.900 { 00:27:11.900 "dma_device_id": "system", 00:27:11.900 "dma_device_type": 1 00:27:11.900 }, 00:27:11.900 { 00:27:11.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:11.900 "dma_device_type": 2 00:27:11.900 } 00:27:11.900 ], 00:27:11.900 "driver_specific": {} 00:27:11.900 }' 00:27:11.900 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:11.900 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.156 10:34:49 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.156 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.414 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:12.414 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:12.414 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:12.414 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:12.672 "name": "BaseBdev2", 00:27:12.672 "aliases": [ 00:27:12.672 "be970164-ee1a-47d3-8d58-617bf60f19c4" 00:27:12.672 ], 00:27:12.672 "product_name": "Malloc disk", 00:27:12.672 "block_size": 4096, 00:27:12.672 "num_blocks": 8192, 00:27:12.672 "uuid": "be970164-ee1a-47d3-8d58-617bf60f19c4", 00:27:12.672 "assigned_rate_limits": { 00:27:12.672 "rw_ios_per_sec": 0, 00:27:12.672 "rw_mbytes_per_sec": 0, 00:27:12.672 "r_mbytes_per_sec": 0, 00:27:12.672 "w_mbytes_per_sec": 0 00:27:12.672 }, 00:27:12.672 "claimed": true, 00:27:12.672 "claim_type": "exclusive_write", 00:27:12.672 "zoned": false, 
00:27:12.672 "supported_io_types": { 00:27:12.672 "read": true, 00:27:12.672 "write": true, 00:27:12.672 "unmap": true, 00:27:12.672 "flush": true, 00:27:12.672 "reset": true, 00:27:12.672 "nvme_admin": false, 00:27:12.672 "nvme_io": false, 00:27:12.672 "nvme_io_md": false, 00:27:12.672 "write_zeroes": true, 00:27:12.672 "zcopy": true, 00:27:12.672 "get_zone_info": false, 00:27:12.672 "zone_management": false, 00:27:12.672 "zone_append": false, 00:27:12.672 "compare": false, 00:27:12.672 "compare_and_write": false, 00:27:12.672 "abort": true, 00:27:12.672 "seek_hole": false, 00:27:12.672 "seek_data": false, 00:27:12.672 "copy": true, 00:27:12.672 "nvme_iov_md": false 00:27:12.672 }, 00:27:12.672 "memory_domains": [ 00:27:12.672 { 00:27:12.672 "dma_device_id": "system", 00:27:12.672 "dma_device_type": 1 00:27:12.672 }, 00:27:12.672 { 00:27:12.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:12.672 "dma_device_type": 2 00:27:12.672 } 00:27:12.672 ], 00:27:12.672 "driver_specific": {} 00:27:12.672 }' 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.672 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:12.930 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:27:12.930 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.930 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:12.930 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:12.930 10:34:49 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:13.189 [2024-07-15 10:34:50.200738] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:13.189 10:34:50 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.189 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:13.448 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.448 "name": "Existed_Raid", 00:27:13.448 "uuid": "a9c69158-a8a6-47c0-acc9-7730c2e7f94e", 00:27:13.448 "strip_size_kb": 0, 00:27:13.448 "state": "online", 00:27:13.448 "raid_level": "raid1", 00:27:13.448 "superblock": true, 00:27:13.448 "num_base_bdevs": 2, 00:27:13.448 "num_base_bdevs_discovered": 1, 00:27:13.448 "num_base_bdevs_operational": 1, 00:27:13.448 "base_bdevs_list": [ 00:27:13.448 { 00:27:13.448 "name": null, 00:27:13.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:13.448 "is_configured": false, 00:27:13.448 "data_offset": 256, 00:27:13.448 "data_size": 7936 00:27:13.448 }, 00:27:13.448 { 00:27:13.448 "name": "BaseBdev2", 00:27:13.448 "uuid": "be970164-ee1a-47d3-8d58-617bf60f19c4", 00:27:13.448 "is_configured": true, 00:27:13.448 "data_offset": 256, 00:27:13.448 "data_size": 7936 00:27:13.448 } 00:27:13.448 ] 00:27:13.448 }' 00:27:13.448 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.448 10:34:50 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:14.015 10:34:51 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:14.015 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:14.015 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.015 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:14.273 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:14.273 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:14.273 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:14.532 [2024-07-15 10:34:51.497323] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:14.532 [2024-07-15 10:34:51.497411] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:14.532 [2024-07-15 10:34:51.509723] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:14.532 [2024-07-15 10:34:51.509761] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:14.532 [2024-07-15 10:34:51.509773] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b8c000 name Existed_Raid, state offline 00:27:14.532 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:14.532 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:14.532 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.532 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 612376 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 612376 ']' 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 612376 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 612376 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 612376' 00:27:14.791 killing process with pid 612376 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 612376 00:27:14.791 [2024-07-15 10:34:51.823397] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:14.791 10:34:51 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@972 -- # wait 612376 00:27:14.791 [2024-07-15 10:34:51.824286] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:15.050 10:34:52 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:27:15.050 00:27:15.050 real 0m10.674s 00:27:15.050 user 0m18.997s 00:27:15.050 sys 0m2.017s 00:27:15.050 10:34:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:15.050 10:34:52 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:15.050 ************************************ 00:27:15.050 END TEST raid_state_function_test_sb_4k 00:27:15.050 ************************************ 00:27:15.050 10:34:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:15.050 10:34:52 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:27:15.050 10:34:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:15.050 10:34:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:15.050 10:34:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:15.050 ************************************ 00:27:15.050 START TEST raid_superblock_test_4k 00:27:15.050 ************************************ 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:15.050 
10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:15.050 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=614006 00:27:15.051 10:34:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 614006 /var/tmp/spdk-raid.sock 00:27:15.051 10:34:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 614006 ']' 00:27:15.051 10:34:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:15.051 10:34:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:15.051 10:34:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and 
listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:15.051 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:15.051 10:34:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:15.051 10:34:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:15.051 [2024-07-15 10:34:52.166333] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:27:15.051 [2024-07-15 10:34:52.166398] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid614006 ] 00:27:15.310 [2024-07-15 10:34:52.287515] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:15.310 [2024-07-15 10:34:52.389633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:15.310 [2024-07-15 10:34:52.449450] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:15.310 [2024-07-15 10:34:52.449488] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:27:16.243 malloc1 00:27:16.243 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:16.501 [2024-07-15 10:34:53.578541] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:16.501 [2024-07-15 10:34:53.578598] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:16.501 [2024-07-15 10:34:53.578620] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1611570 00:27:16.501 [2024-07-15 10:34:53.578633] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:16.501 [2024-07-15 10:34:53.580393] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:16.501 [2024-07-15 10:34:53.580423] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:16.501 pt1 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:16.501 10:34:53 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:16.501 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:27:16.759 malloc2 00:27:16.759 10:34:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:17.017 [2024-07-15 10:34:54.072664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:17.017 [2024-07-15 10:34:54.072711] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:17.017 [2024-07-15 10:34:54.072729] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1612970 00:27:17.017 [2024-07-15 10:34:54.072742] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:17.017 [2024-07-15 10:34:54.074348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:17.017 [2024-07-15 10:34:54.074378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:17.017 pt2 00:27:17.017 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:17.017 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= 
num_base_bdevs )) 00:27:17.017 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:17.275 [2024-07-15 10:34:54.305298] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:17.275 [2024-07-15 10:34:54.306591] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:17.275 [2024-07-15 10:34:54.306740] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17b5270 00:27:17.275 [2024-07-15 10:34:54.306753] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:17.275 [2024-07-15 10:34:54.306963] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16090e0 00:27:17.275 [2024-07-15 10:34:54.307111] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17b5270 00:27:17.275 [2024-07-15 10:34:54.307121] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17b5270 00:27:17.275 [2024-07-15 10:34:54.307220] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:17.275 10:34:54 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.275 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:17.533 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:17.533 "name": "raid_bdev1", 00:27:17.533 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:17.533 "strip_size_kb": 0, 00:27:17.533 "state": "online", 00:27:17.533 "raid_level": "raid1", 00:27:17.533 "superblock": true, 00:27:17.533 "num_base_bdevs": 2, 00:27:17.533 "num_base_bdevs_discovered": 2, 00:27:17.533 "num_base_bdevs_operational": 2, 00:27:17.533 "base_bdevs_list": [ 00:27:17.533 { 00:27:17.533 "name": "pt1", 00:27:17.533 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:17.533 "is_configured": true, 00:27:17.533 "data_offset": 256, 00:27:17.533 "data_size": 7936 00:27:17.533 }, 00:27:17.533 { 00:27:17.533 "name": "pt2", 00:27:17.533 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:17.533 "is_configured": true, 00:27:17.533 "data_offset": 256, 00:27:17.533 "data_size": 7936 00:27:17.533 } 00:27:17.533 ] 00:27:17.533 }' 00:27:17.533 10:34:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:17.533 10:34:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:18.097 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:18.354 [2024-07-15 10:34:55.380363] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:18.354 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:18.354 "name": "raid_bdev1", 00:27:18.354 "aliases": [ 00:27:18.354 "a685bf63-34b0-4bc2-b013-572912b0e7a9" 00:27:18.354 ], 00:27:18.354 "product_name": "Raid Volume", 00:27:18.354 "block_size": 4096, 00:27:18.354 "num_blocks": 7936, 00:27:18.354 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:18.354 "assigned_rate_limits": { 00:27:18.354 "rw_ios_per_sec": 0, 00:27:18.354 "rw_mbytes_per_sec": 0, 00:27:18.354 "r_mbytes_per_sec": 0, 00:27:18.354 "w_mbytes_per_sec": 0 00:27:18.354 }, 00:27:18.354 "claimed": false, 00:27:18.354 "zoned": false, 00:27:18.354 "supported_io_types": { 00:27:18.354 "read": true, 00:27:18.354 "write": true, 00:27:18.354 "unmap": false, 00:27:18.354 "flush": false, 00:27:18.354 "reset": true, 00:27:18.354 "nvme_admin": false, 00:27:18.354 "nvme_io": false, 00:27:18.354 "nvme_io_md": false, 00:27:18.354 "write_zeroes": true, 00:27:18.354 "zcopy": false, 
00:27:18.354 "get_zone_info": false, 00:27:18.354 "zone_management": false, 00:27:18.354 "zone_append": false, 00:27:18.354 "compare": false, 00:27:18.354 "compare_and_write": false, 00:27:18.354 "abort": false, 00:27:18.354 "seek_hole": false, 00:27:18.354 "seek_data": false, 00:27:18.354 "copy": false, 00:27:18.354 "nvme_iov_md": false 00:27:18.354 }, 00:27:18.354 "memory_domains": [ 00:27:18.354 { 00:27:18.354 "dma_device_id": "system", 00:27:18.354 "dma_device_type": 1 00:27:18.354 }, 00:27:18.354 { 00:27:18.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:18.354 "dma_device_type": 2 00:27:18.354 }, 00:27:18.354 { 00:27:18.354 "dma_device_id": "system", 00:27:18.354 "dma_device_type": 1 00:27:18.354 }, 00:27:18.354 { 00:27:18.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:18.354 "dma_device_type": 2 00:27:18.354 } 00:27:18.354 ], 00:27:18.354 "driver_specific": { 00:27:18.354 "raid": { 00:27:18.354 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:18.354 "strip_size_kb": 0, 00:27:18.354 "state": "online", 00:27:18.354 "raid_level": "raid1", 00:27:18.354 "superblock": true, 00:27:18.354 "num_base_bdevs": 2, 00:27:18.354 "num_base_bdevs_discovered": 2, 00:27:18.354 "num_base_bdevs_operational": 2, 00:27:18.354 "base_bdevs_list": [ 00:27:18.354 { 00:27:18.354 "name": "pt1", 00:27:18.354 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:18.354 "is_configured": true, 00:27:18.354 "data_offset": 256, 00:27:18.354 "data_size": 7936 00:27:18.354 }, 00:27:18.354 { 00:27:18.354 "name": "pt2", 00:27:18.354 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:18.354 "is_configured": true, 00:27:18.354 "data_offset": 256, 00:27:18.354 "data_size": 7936 00:27:18.354 } 00:27:18.354 ] 00:27:18.354 } 00:27:18.354 } 00:27:18.354 }' 00:27:18.354 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:18.354 10:34:55 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:18.354 pt2' 00:27:18.354 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:18.354 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:18.354 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:18.611 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:18.611 "name": "pt1", 00:27:18.611 "aliases": [ 00:27:18.611 "00000000-0000-0000-0000-000000000001" 00:27:18.611 ], 00:27:18.611 "product_name": "passthru", 00:27:18.611 "block_size": 4096, 00:27:18.611 "num_blocks": 8192, 00:27:18.611 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:18.611 "assigned_rate_limits": { 00:27:18.611 "rw_ios_per_sec": 0, 00:27:18.611 "rw_mbytes_per_sec": 0, 00:27:18.611 "r_mbytes_per_sec": 0, 00:27:18.611 "w_mbytes_per_sec": 0 00:27:18.611 }, 00:27:18.611 "claimed": true, 00:27:18.611 "claim_type": "exclusive_write", 00:27:18.611 "zoned": false, 00:27:18.611 "supported_io_types": { 00:27:18.611 "read": true, 00:27:18.611 "write": true, 00:27:18.611 "unmap": true, 00:27:18.611 "flush": true, 00:27:18.611 "reset": true, 00:27:18.611 "nvme_admin": false, 00:27:18.611 "nvme_io": false, 00:27:18.611 "nvme_io_md": false, 00:27:18.611 "write_zeroes": true, 00:27:18.611 "zcopy": true, 00:27:18.611 "get_zone_info": false, 00:27:18.611 "zone_management": false, 00:27:18.611 "zone_append": false, 00:27:18.611 "compare": false, 00:27:18.611 "compare_and_write": false, 00:27:18.611 "abort": true, 00:27:18.611 "seek_hole": false, 00:27:18.611 "seek_data": false, 00:27:18.611 "copy": true, 00:27:18.611 "nvme_iov_md": false 00:27:18.611 }, 00:27:18.611 "memory_domains": [ 00:27:18.611 { 00:27:18.611 "dma_device_id": "system", 00:27:18.611 "dma_device_type": 1 00:27:18.611 }, 
00:27:18.611 { 00:27:18.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:18.611 "dma_device_type": 2 00:27:18.611 } 00:27:18.611 ], 00:27:18.611 "driver_specific": { 00:27:18.611 "passthru": { 00:27:18.611 "name": "pt1", 00:27:18.611 "base_bdev_name": "malloc1" 00:27:18.611 } 00:27:18.611 } 00:27:18.611 }' 00:27:18.611 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:18.611 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:18.611 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:18.611 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.869 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:18.869 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:18.869 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.869 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:18.869 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:18.869 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.869 10:34:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:18.869 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:18.869 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:18.869 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:18.869 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:19.147 10:34:56 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:19.147 "name": "pt2", 00:27:19.147 "aliases": [ 00:27:19.147 "00000000-0000-0000-0000-000000000002" 00:27:19.147 ], 00:27:19.147 "product_name": "passthru", 00:27:19.147 "block_size": 4096, 00:27:19.147 "num_blocks": 8192, 00:27:19.147 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:19.147 "assigned_rate_limits": { 00:27:19.147 "rw_ios_per_sec": 0, 00:27:19.147 "rw_mbytes_per_sec": 0, 00:27:19.147 "r_mbytes_per_sec": 0, 00:27:19.147 "w_mbytes_per_sec": 0 00:27:19.147 }, 00:27:19.147 "claimed": true, 00:27:19.147 "claim_type": "exclusive_write", 00:27:19.147 "zoned": false, 00:27:19.147 "supported_io_types": { 00:27:19.147 "read": true, 00:27:19.147 "write": true, 00:27:19.147 "unmap": true, 00:27:19.147 "flush": true, 00:27:19.147 "reset": true, 00:27:19.147 "nvme_admin": false, 00:27:19.147 "nvme_io": false, 00:27:19.147 "nvme_io_md": false, 00:27:19.147 "write_zeroes": true, 00:27:19.147 "zcopy": true, 00:27:19.147 "get_zone_info": false, 00:27:19.147 "zone_management": false, 00:27:19.147 "zone_append": false, 00:27:19.147 "compare": false, 00:27:19.147 "compare_and_write": false, 00:27:19.147 "abort": true, 00:27:19.147 "seek_hole": false, 00:27:19.147 "seek_data": false, 00:27:19.147 "copy": true, 00:27:19.147 "nvme_iov_md": false 00:27:19.147 }, 00:27:19.147 "memory_domains": [ 00:27:19.147 { 00:27:19.147 "dma_device_id": "system", 00:27:19.147 "dma_device_type": 1 00:27:19.147 }, 00:27:19.147 { 00:27:19.147 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:19.147 "dma_device_type": 2 00:27:19.147 } 00:27:19.147 ], 00:27:19.147 "driver_specific": { 00:27:19.147 "passthru": { 00:27:19.147 "name": "pt2", 00:27:19.147 "base_bdev_name": "malloc2" 00:27:19.147 } 00:27:19.147 } 00:27:19.147 }' 00:27:19.147 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:19.410 
10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:19.410 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:19.667 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:19.667 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:19.667 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:19.667 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:19.924 [2024-07-15 10:34:56.884323] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:19.924 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a685bf63-34b0-4bc2-b013-572912b0e7a9 00:27:19.924 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z a685bf63-34b0-4bc2-b013-572912b0e7a9 ']' 00:27:19.924 10:34:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:20.181 [2024-07-15 10:34:57.132743] bdev_raid.c:2356:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:27:20.181 [2024-07-15 10:34:57.132764] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:20.181 [2024-07-15 10:34:57.132818] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:20.181 [2024-07-15 10:34:57.132875] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:20.181 [2024-07-15 10:34:57.132887] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b5270 name raid_bdev1, state offline 00:27:20.181 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.181 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:20.437 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:20.437 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:20.437 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:20.437 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:20.695 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:20.695 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:20.953 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:20.953 10:34:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # 
jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:20.953 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:20.953 10:34:58 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:21.211 [2024-07-15 10:34:58.367998] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:21.211 [2024-07-15 10:34:58.369417] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:21.211 [2024-07-15 10:34:58.369473] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:21.211 [2024-07-15 10:34:58.369514] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:21.211 [2024-07-15 10:34:58.369532] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:21.211 [2024-07-15 10:34:58.369542] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b4ff0 name raid_bdev1, state configuring 00:27:21.211 request: 00:27:21.211 { 00:27:21.211 "name": "raid_bdev1", 00:27:21.211 "raid_level": "raid1", 00:27:21.211 "base_bdevs": [ 00:27:21.211 "malloc1", 00:27:21.211 "malloc2" 00:27:21.211 ], 00:27:21.211 "superblock": false, 00:27:21.211 "method": "bdev_raid_create", 00:27:21.211 "req_id": 1 00:27:21.211 } 00:27:21.211 Got JSON-RPC error response 00:27:21.211 response: 00:27:21.211 { 00:27:21.211 "code": -17, 00:27:21.211 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:21.211 } 00:27:21.211 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:27:21.211 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:21.211 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:21.211 10:34:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( 
!es == 0 )) 00:27:21.211 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.211 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:21.469 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:21.469 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:21.469 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:21.727 [2024-07-15 10:34:58.857217] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:21.727 [2024-07-15 10:34:58.857270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.727 [2024-07-15 10:34:58.857293] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16117a0 00:27:21.727 [2024-07-15 10:34:58.857306] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.727 [2024-07-15 10:34:58.858969] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.727 [2024-07-15 10:34:58.858997] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:21.727 [2024-07-15 10:34:58.859067] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:21.727 [2024-07-15 10:34:58.859094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:21.727 pt1 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.727 10:34:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.985 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.985 "name": "raid_bdev1", 00:27:21.985 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:21.985 "strip_size_kb": 0, 00:27:21.985 "state": "configuring", 00:27:21.985 "raid_level": "raid1", 00:27:21.985 "superblock": true, 00:27:21.985 "num_base_bdevs": 2, 00:27:21.985 "num_base_bdevs_discovered": 1, 00:27:21.985 "num_base_bdevs_operational": 2, 00:27:21.985 "base_bdevs_list": [ 00:27:21.985 { 00:27:21.985 "name": "pt1", 00:27:21.985 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:21.985 "is_configured": true, 00:27:21.985 "data_offset": 256, 00:27:21.985 "data_size": 7936 00:27:21.985 }, 00:27:21.985 { 00:27:21.985 "name": null, 
00:27:21.985 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:21.985 "is_configured": false, 00:27:21.985 "data_offset": 256, 00:27:21.985 "data_size": 7936 00:27:21.985 } 00:27:21.985 ] 00:27:21.985 }' 00:27:21.985 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.985 10:34:59 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:22.550 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:22.550 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:22.550 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:22.550 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:22.807 [2024-07-15 10:34:59.904004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:22.807 [2024-07-15 10:34:59.904065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:22.807 [2024-07-15 10:34:59.904085] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17a96f0 00:27:22.807 [2024-07-15 10:34:59.904098] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:22.807 [2024-07-15 10:34:59.904450] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:22.807 [2024-07-15 10:34:59.904470] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:22.807 [2024-07-15 10:34:59.904536] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:22.807 [2024-07-15 10:34:59.904556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:22.807 [2024-07-15 10:34:59.904652] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17aa590 00:27:22.807 [2024-07-15 10:34:59.904663] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:22.807 [2024-07-15 10:34:59.904832] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x160b540 00:27:22.807 [2024-07-15 10:34:59.904971] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17aa590 00:27:22.807 [2024-07-15 10:34:59.904982] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17aa590 00:27:22.807 [2024-07-15 10:34:59.905081] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:22.807 pt2 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:22.807 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:22.808 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:22.808 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:22.808 10:34:59 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:22.808 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.808 10:34:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.065 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.066 "name": "raid_bdev1", 00:27:23.066 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:23.066 "strip_size_kb": 0, 00:27:23.066 "state": "online", 00:27:23.066 "raid_level": "raid1", 00:27:23.066 "superblock": true, 00:27:23.066 "num_base_bdevs": 2, 00:27:23.066 "num_base_bdevs_discovered": 2, 00:27:23.066 "num_base_bdevs_operational": 2, 00:27:23.066 "base_bdevs_list": [ 00:27:23.066 { 00:27:23.066 "name": "pt1", 00:27:23.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:23.066 "is_configured": true, 00:27:23.066 "data_offset": 256, 00:27:23.066 "data_size": 7936 00:27:23.066 }, 00:27:23.066 { 00:27:23.066 "name": "pt2", 00:27:23.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:23.066 "is_configured": true, 00:27:23.066 "data_offset": 256, 00:27:23.066 "data_size": 7936 00:27:23.066 } 00:27:23.066 ] 00:27:23.066 }' 00:27:23.066 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.066 10:35:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 
-- # local base_bdev_info 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:23.631 10:35:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:23.890 [2024-07-15 10:35:00.991127] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:23.890 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:23.890 "name": "raid_bdev1", 00:27:23.890 "aliases": [ 00:27:23.890 "a685bf63-34b0-4bc2-b013-572912b0e7a9" 00:27:23.890 ], 00:27:23.890 "product_name": "Raid Volume", 00:27:23.890 "block_size": 4096, 00:27:23.890 "num_blocks": 7936, 00:27:23.890 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:23.890 "assigned_rate_limits": { 00:27:23.890 "rw_ios_per_sec": 0, 00:27:23.890 "rw_mbytes_per_sec": 0, 00:27:23.890 "r_mbytes_per_sec": 0, 00:27:23.890 "w_mbytes_per_sec": 0 00:27:23.890 }, 00:27:23.890 "claimed": false, 00:27:23.890 "zoned": false, 00:27:23.890 "supported_io_types": { 00:27:23.890 "read": true, 00:27:23.890 "write": true, 00:27:23.890 "unmap": false, 00:27:23.890 "flush": false, 00:27:23.890 "reset": true, 00:27:23.890 "nvme_admin": false, 00:27:23.890 "nvme_io": false, 00:27:23.890 "nvme_io_md": false, 00:27:23.890 "write_zeroes": true, 00:27:23.890 "zcopy": false, 00:27:23.890 "get_zone_info": false, 00:27:23.890 "zone_management": false, 00:27:23.890 "zone_append": false, 00:27:23.890 "compare": false, 00:27:23.890 "compare_and_write": false, 00:27:23.890 "abort": false, 00:27:23.890 "seek_hole": false, 00:27:23.890 "seek_data": false, 00:27:23.890 "copy": false, 00:27:23.890 "nvme_iov_md": false 
00:27:23.890 }, 00:27:23.890 "memory_domains": [ 00:27:23.890 { 00:27:23.890 "dma_device_id": "system", 00:27:23.890 "dma_device_type": 1 00:27:23.890 }, 00:27:23.890 { 00:27:23.890 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.890 "dma_device_type": 2 00:27:23.890 }, 00:27:23.890 { 00:27:23.890 "dma_device_id": "system", 00:27:23.890 "dma_device_type": 1 00:27:23.890 }, 00:27:23.890 { 00:27:23.890 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:23.890 "dma_device_type": 2 00:27:23.890 } 00:27:23.890 ], 00:27:23.890 "driver_specific": { 00:27:23.890 "raid": { 00:27:23.890 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:23.890 "strip_size_kb": 0, 00:27:23.890 "state": "online", 00:27:23.890 "raid_level": "raid1", 00:27:23.890 "superblock": true, 00:27:23.890 "num_base_bdevs": 2, 00:27:23.890 "num_base_bdevs_discovered": 2, 00:27:23.890 "num_base_bdevs_operational": 2, 00:27:23.890 "base_bdevs_list": [ 00:27:23.890 { 00:27:23.890 "name": "pt1", 00:27:23.890 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:23.890 "is_configured": true, 00:27:23.890 "data_offset": 256, 00:27:23.890 "data_size": 7936 00:27:23.890 }, 00:27:23.890 { 00:27:23.890 "name": "pt2", 00:27:23.890 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:23.890 "is_configured": true, 00:27:23.890 "data_offset": 256, 00:27:23.890 "data_size": 7936 00:27:23.890 } 00:27:23.890 ] 00:27:23.890 } 00:27:23.890 } 00:27:23.890 }' 00:27:23.890 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:23.890 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:23.890 pt2' 00:27:23.890 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:23.890 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:23.890 10:35:01 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:24.149 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:24.149 "name": "pt1", 00:27:24.149 "aliases": [ 00:27:24.149 "00000000-0000-0000-0000-000000000001" 00:27:24.149 ], 00:27:24.149 "product_name": "passthru", 00:27:24.149 "block_size": 4096, 00:27:24.149 "num_blocks": 8192, 00:27:24.149 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:24.149 "assigned_rate_limits": { 00:27:24.149 "rw_ios_per_sec": 0, 00:27:24.149 "rw_mbytes_per_sec": 0, 00:27:24.149 "r_mbytes_per_sec": 0, 00:27:24.149 "w_mbytes_per_sec": 0 00:27:24.149 }, 00:27:24.149 "claimed": true, 00:27:24.149 "claim_type": "exclusive_write", 00:27:24.149 "zoned": false, 00:27:24.149 "supported_io_types": { 00:27:24.149 "read": true, 00:27:24.149 "write": true, 00:27:24.149 "unmap": true, 00:27:24.149 "flush": true, 00:27:24.149 "reset": true, 00:27:24.149 "nvme_admin": false, 00:27:24.149 "nvme_io": false, 00:27:24.149 "nvme_io_md": false, 00:27:24.149 "write_zeroes": true, 00:27:24.149 "zcopy": true, 00:27:24.149 "get_zone_info": false, 00:27:24.149 "zone_management": false, 00:27:24.149 "zone_append": false, 00:27:24.149 "compare": false, 00:27:24.149 "compare_and_write": false, 00:27:24.149 "abort": true, 00:27:24.149 "seek_hole": false, 00:27:24.149 "seek_data": false, 00:27:24.149 "copy": true, 00:27:24.149 "nvme_iov_md": false 00:27:24.149 }, 00:27:24.149 "memory_domains": [ 00:27:24.149 { 00:27:24.149 "dma_device_id": "system", 00:27:24.149 "dma_device_type": 1 00:27:24.149 }, 00:27:24.149 { 00:27:24.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:24.149 "dma_device_type": 2 00:27:24.149 } 00:27:24.149 ], 00:27:24.149 "driver_specific": { 00:27:24.149 "passthru": { 00:27:24.149 "name": "pt1", 00:27:24.149 "base_bdev_name": "malloc1" 00:27:24.149 } 00:27:24.149 } 00:27:24.149 }' 00:27:24.149 10:35:01 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:24.408 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:24.667 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:24.667 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:24.667 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:24.667 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:24.667 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:24.667 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:24.667 "name": "pt2", 00:27:24.667 "aliases": [ 00:27:24.667 "00000000-0000-0000-0000-000000000002" 00:27:24.667 ], 00:27:24.667 "product_name": "passthru", 00:27:24.667 "block_size": 4096, 00:27:24.667 "num_blocks": 8192, 00:27:24.667 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:27:24.667 "assigned_rate_limits": { 00:27:24.667 "rw_ios_per_sec": 0, 00:27:24.667 "rw_mbytes_per_sec": 0, 00:27:24.667 "r_mbytes_per_sec": 0, 00:27:24.667 "w_mbytes_per_sec": 0 00:27:24.667 }, 00:27:24.667 "claimed": true, 00:27:24.667 "claim_type": "exclusive_write", 00:27:24.667 "zoned": false, 00:27:24.667 "supported_io_types": { 00:27:24.667 "read": true, 00:27:24.667 "write": true, 00:27:24.667 "unmap": true, 00:27:24.667 "flush": true, 00:27:24.667 "reset": true, 00:27:24.667 "nvme_admin": false, 00:27:24.667 "nvme_io": false, 00:27:24.667 "nvme_io_md": false, 00:27:24.667 "write_zeroes": true, 00:27:24.667 "zcopy": true, 00:27:24.667 "get_zone_info": false, 00:27:24.667 "zone_management": false, 00:27:24.667 "zone_append": false, 00:27:24.667 "compare": false, 00:27:24.667 "compare_and_write": false, 00:27:24.667 "abort": true, 00:27:24.667 "seek_hole": false, 00:27:24.667 "seek_data": false, 00:27:24.667 "copy": true, 00:27:24.667 "nvme_iov_md": false 00:27:24.667 }, 00:27:24.667 "memory_domains": [ 00:27:24.667 { 00:27:24.667 "dma_device_id": "system", 00:27:24.667 "dma_device_type": 1 00:27:24.667 }, 00:27:24.667 { 00:27:24.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:24.667 "dma_device_type": 2 00:27:24.667 } 00:27:24.667 ], 00:27:24.667 "driver_specific": { 00:27:24.667 "passthru": { 00:27:24.667 "name": "pt2", 00:27:24.667 "base_bdev_name": "malloc2" 00:27:24.667 } 00:27:24.667 } 00:27:24.667 }' 00:27:24.667 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:24.926 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:24.926 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:24.926 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:24.926 10:35:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:24.926 
10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:27:24.926 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:24.926 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:24.926 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:27:24.926 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:24.926 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:25.184 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:27:25.184 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:25.184 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:25.443 [2024-07-15 10:35:02.414886] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:25.443 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' a685bf63-34b0-4bc2-b013-572912b0e7a9 '!=' a685bf63-34b0-4bc2-b013-572912b0e7a9 ']' 00:27:25.443 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:25.443 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:25.443 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:27:25.443 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:25.715 [2024-07-15 10:35:02.663333] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.715 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.974 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:25.974 "name": "raid_bdev1", 00:27:25.974 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:25.974 "strip_size_kb": 0, 00:27:25.974 "state": "online", 00:27:25.974 "raid_level": "raid1", 00:27:25.974 "superblock": true, 00:27:25.974 "num_base_bdevs": 2, 00:27:25.974 "num_base_bdevs_discovered": 1, 00:27:25.974 "num_base_bdevs_operational": 1, 00:27:25.974 "base_bdevs_list": [ 00:27:25.974 { 00:27:25.974 "name": null, 00:27:25.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:25.974 
"is_configured": false, 00:27:25.974 "data_offset": 256, 00:27:25.974 "data_size": 7936 00:27:25.974 }, 00:27:25.974 { 00:27:25.974 "name": "pt2", 00:27:25.974 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:25.974 "is_configured": true, 00:27:25.974 "data_offset": 256, 00:27:25.974 "data_size": 7936 00:27:25.974 } 00:27:25.974 ] 00:27:25.974 }' 00:27:25.974 10:35:02 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:25.974 10:35:02 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:26.538 10:35:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:26.796 [2024-07-15 10:35:03.754206] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:26.796 [2024-07-15 10:35:03.754234] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:26.796 [2024-07-15 10:35:03.754290] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:26.796 [2024-07-15 10:35:03.754333] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:26.796 [2024-07-15 10:35:03.754346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17aa590 name raid_bdev1, state offline 00:27:26.796 10:35:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.796 10:35:03 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:27.054 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:27.054 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:27.054 10:35:04 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:27.054 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:27.054 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:27.312 [2024-07-15 10:35:04.480113] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:27.312 [2024-07-15 10:35:04.480164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.312 [2024-07-15 10:35:04.480182] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1612160 00:27:27.312 [2024-07-15 10:35:04.480195] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.312 [2024-07-15 10:35:04.481847] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.312 [2024-07-15 10:35:04.481878] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:27.312 [2024-07-15 10:35:04.481958] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:27.312 [2024-07-15 
10:35:04.481988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:27.312 [2024-07-15 10:35:04.482077] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1608380 00:27:27.312 [2024-07-15 10:35:04.482088] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:27.312 [2024-07-15 10:35:04.482259] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1609a80 00:27:27.312 [2024-07-15 10:35:04.482382] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1608380 00:27:27.312 [2024-07-15 10:35:04.482392] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1608380 00:27:27.312 [2024-07-15 10:35:04.482488] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:27.312 pt2 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 
00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.312 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.571 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:27.571 "name": "raid_bdev1", 00:27:27.571 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:27.571 "strip_size_kb": 0, 00:27:27.571 "state": "online", 00:27:27.571 "raid_level": "raid1", 00:27:27.571 "superblock": true, 00:27:27.571 "num_base_bdevs": 2, 00:27:27.571 "num_base_bdevs_discovered": 1, 00:27:27.571 "num_base_bdevs_operational": 1, 00:27:27.571 "base_bdevs_list": [ 00:27:27.571 { 00:27:27.571 "name": null, 00:27:27.571 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:27.571 "is_configured": false, 00:27:27.571 "data_offset": 256, 00:27:27.571 "data_size": 7936 00:27:27.571 }, 00:27:27.571 { 00:27:27.571 "name": "pt2", 00:27:27.571 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:27.571 "is_configured": true, 00:27:27.571 "data_offset": 256, 00:27:27.571 "data_size": 7936 00:27:27.571 } 00:27:27.571 ] 00:27:27.571 }' 00:27:27.571 10:35:04 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:27.571 10:35:04 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:28.136 10:35:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:28.394 [2024-07-15 10:35:05.486938] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:28.394 [2024-07-15 10:35:05.486967] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:28.394 [2024-07-15 10:35:05.487023] 
bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:28.394 [2024-07-15 10:35:05.487068] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:28.394 [2024-07-15 10:35:05.487080] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1608380 name raid_bdev1, state offline 00:27:28.394 10:35:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:28.394 10:35:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.653 10:35:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:28.653 10:35:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:28.653 10:35:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:28.653 10:35:05 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:28.911 [2024-07-15 10:35:05.984226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:28.911 [2024-07-15 10:35:05.984274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:28.911 [2024-07-15 10:35:05.984292] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b4520 00:27:28.911 [2024-07-15 10:35:05.984305] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:28.911 [2024-07-15 10:35:05.985893] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:28.911 [2024-07-15 10:35:05.985922] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:28.911 [2024-07-15 10:35:05.986000] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:28.911 [2024-07-15 10:35:05.986026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:28.911 [2024-07-15 10:35:05.986120] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:28.911 [2024-07-15 10:35:05.986133] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:28.911 [2024-07-15 10:35:05.986147] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16093f0 name raid_bdev1, state configuring 00:27:28.911 [2024-07-15 10:35:05.986177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:28.911 [2024-07-15 10:35:05.986235] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x160b2b0 00:27:28.911 [2024-07-15 10:35:05.986246] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:28.911 [2024-07-15 10:35:05.986402] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1608350 00:27:28.911 [2024-07-15 10:35:05.986522] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x160b2b0 00:27:28.911 [2024-07-15 10:35:05.986532] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x160b2b0 00:27:28.911 [2024-07-15 10:35:05.986629] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:28.911 pt1 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.911 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.912 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.170 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.170 "name": "raid_bdev1", 00:27:29.170 "uuid": "a685bf63-34b0-4bc2-b013-572912b0e7a9", 00:27:29.170 "strip_size_kb": 0, 00:27:29.170 "state": "online", 00:27:29.170 "raid_level": "raid1", 00:27:29.170 "superblock": true, 00:27:29.170 "num_base_bdevs": 2, 00:27:29.170 "num_base_bdevs_discovered": 1, 00:27:29.170 "num_base_bdevs_operational": 1, 00:27:29.170 "base_bdevs_list": [ 00:27:29.170 { 00:27:29.170 "name": null, 00:27:29.170 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.170 "is_configured": false, 00:27:29.170 "data_offset": 256, 00:27:29.170 "data_size": 7936 00:27:29.170 }, 00:27:29.170 { 00:27:29.170 "name": "pt2", 00:27:29.170 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:29.170 "is_configured": true, 00:27:29.170 "data_offset": 
256, 00:27:29.170 "data_size": 7936 00:27:29.170 } 00:27:29.170 ] 00:27:29.170 }' 00:27:29.170 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:29.170 10:35:06 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:29.735 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:29.735 10:35:06 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:29.994 10:35:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:29.994 10:35:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:29.994 10:35:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:30.253 [2024-07-15 10:35:07.271854] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' a685bf63-34b0-4bc2-b013-572912b0e7a9 '!=' a685bf63-34b0-4bc2-b013-572912b0e7a9 ']' 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 614006 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 614006 ']' 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 614006 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
614006 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 614006' 00:27:30.253 killing process with pid 614006 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 614006 00:27:30.253 [2024-07-15 10:35:07.338140] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:30.253 [2024-07-15 10:35:07.338193] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:30.253 [2024-07-15 10:35:07.338234] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:30.253 [2024-07-15 10:35:07.338246] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x160b2b0 name raid_bdev1, state offline 00:27:30.253 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 614006 00:27:30.253 [2024-07-15 10:35:07.354231] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:30.512 10:35:07 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:27:30.512 00:27:30.512 real 0m15.460s 00:27:30.512 user 0m28.036s 00:27:30.512 sys 0m2.867s 00:27:30.512 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:30.512 10:35:07 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:27:30.512 ************************************ 00:27:30.512 END TEST raid_superblock_test_4k 00:27:30.512 ************************************ 00:27:30.512 10:35:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:30.512 10:35:07 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:27:30.512 10:35:07 bdev_raid 
-- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:27:30.512 10:35:07 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:30.512 10:35:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:30.512 10:35:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:30.512 ************************************ 00:27:30.512 START TEST raid_rebuild_test_sb_4k 00:27:30.512 ************************************ 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= 
num_base_bdevs )) 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=616262 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 616262 /var/tmp/spdk-raid.sock 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 616262 ']' 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:30.512 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:30.512 10:35:07 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:30.771 [2024-07-15 10:35:07.717341] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:27:30.771 [2024-07-15 10:35:07.717407] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid616262 ] 00:27:30.771 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:30.771 Zero copy mechanism will not be used. 
00:27:30.771 [2024-07-15 10:35:07.846101] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:30.771 [2024-07-15 10:35:07.953216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:31.030 [2024-07-15 10:35:08.017741] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:31.030 [2024-07-15 10:35:08.017797] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:31.597 10:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:31.597 10:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:27:31.597 10:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:31.597 10:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:27:31.854 BaseBdev1_malloc 00:27:31.855 10:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:32.124 [2024-07-15 10:35:09.121000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:32.124 [2024-07-15 10:35:09.121050] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.124 [2024-07-15 10:35:09.121076] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa75d40 00:27:32.124 [2024-07-15 10:35:09.121095] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.124 [2024-07-15 10:35:09.122888] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.124 [2024-07-15 10:35:09.122919] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:32.124 
BaseBdev1 00:27:32.124 10:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:32.124 10:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:27:32.390 BaseBdev2_malloc 00:27:32.390 10:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:32.647 [2024-07-15 10:35:09.631213] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:32.647 [2024-07-15 10:35:09.631264] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.647 [2024-07-15 10:35:09.631289] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa76860 00:27:32.647 [2024-07-15 10:35:09.631302] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.647 [2024-07-15 10:35:09.632903] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.647 [2024-07-15 10:35:09.632939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:32.647 BaseBdev2 00:27:32.647 10:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:27:32.906 spare_malloc 00:27:32.906 10:35:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:33.173 spare_delay 00:27:33.173 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:33.440 [2024-07-15 10:35:10.387034] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:33.440 [2024-07-15 10:35:10.387080] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:33.440 [2024-07-15 10:35:10.387103] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc24ec0 00:27:33.440 [2024-07-15 10:35:10.387116] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:33.440 [2024-07-15 10:35:10.388733] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:33.440 [2024-07-15 10:35:10.388762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:33.440 spare 00:27:33.440 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:33.440 [2024-07-15 10:35:10.627706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:33.440 [2024-07-15 10:35:10.629037] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:33.440 [2024-07-15 10:35:10.629203] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc26070 00:27:33.440 [2024-07-15 10:35:10.629216] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:33.440 [2024-07-15 10:35:10.629413] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc1f490 00:27:33.440 [2024-07-15 10:35:10.629554] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc26070 00:27:33.440 [2024-07-15 10:35:10.629564] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0xc26070 00:27:33.440 [2024-07-15 10:35:10.629664] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.699 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.957 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:33.957 "name": "raid_bdev1", 00:27:33.957 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:33.957 "strip_size_kb": 0, 00:27:33.957 "state": "online", 00:27:33.957 "raid_level": "raid1", 00:27:33.957 "superblock": true, 00:27:33.957 "num_base_bdevs": 2, 00:27:33.957 
"num_base_bdevs_discovered": 2, 00:27:33.957 "num_base_bdevs_operational": 2, 00:27:33.957 "base_bdevs_list": [ 00:27:33.957 { 00:27:33.957 "name": "BaseBdev1", 00:27:33.957 "uuid": "7110245e-bef4-55ab-a316-4409a77f2568", 00:27:33.957 "is_configured": true, 00:27:33.957 "data_offset": 256, 00:27:33.957 "data_size": 7936 00:27:33.957 }, 00:27:33.957 { 00:27:33.957 "name": "BaseBdev2", 00:27:33.957 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:33.957 "is_configured": true, 00:27:33.957 "data_offset": 256, 00:27:33.957 "data_size": 7936 00:27:33.957 } 00:27:33.957 ] 00:27:33.957 }' 00:27:33.957 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:33.957 10:35:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:34.524 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:34.524 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:34.524 [2024-07-15 10:35:11.718822] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:34.784 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:34.784 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.784 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:35.043 
10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:35.043 10:35:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:35.043 [2024-07-15 10:35:12.215921] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc1f490 00:27:35.043 /dev/nbd0 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:35.302 10:35:12 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:35.302 1+0 records in 00:27:35.302 1+0 records out 00:27:35.302 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241692 s, 16.9 MB/s 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:35.302 10:35:12 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:35.867 7936+0 records in 00:27:35.867 7936+0 records out 00:27:35.867 32505856 bytes (33 MB, 31 MiB) copied, 0.740244 s, 43.9 MB/s 00:27:35.867 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:35.867 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:35.867 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:35.867 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:35.867 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:35.867 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:35.867 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:36.125 [2024-07-15 10:35:13.297117] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:27:36.125 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:36.381 [2024-07-15 10:35:13.533791] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.381 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.638 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:36.638 "name": "raid_bdev1", 00:27:36.638 "uuid": 
"ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:36.638 "strip_size_kb": 0, 00:27:36.638 "state": "online", 00:27:36.638 "raid_level": "raid1", 00:27:36.638 "superblock": true, 00:27:36.638 "num_base_bdevs": 2, 00:27:36.638 "num_base_bdevs_discovered": 1, 00:27:36.638 "num_base_bdevs_operational": 1, 00:27:36.638 "base_bdevs_list": [ 00:27:36.638 { 00:27:36.638 "name": null, 00:27:36.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:36.638 "is_configured": false, 00:27:36.638 "data_offset": 256, 00:27:36.638 "data_size": 7936 00:27:36.638 }, 00:27:36.638 { 00:27:36.638 "name": "BaseBdev2", 00:27:36.638 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:36.638 "is_configured": true, 00:27:36.638 "data_offset": 256, 00:27:36.638 "data_size": 7936 00:27:36.638 } 00:27:36.638 ] 00:27:36.638 }' 00:27:36.638 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:36.638 10:35:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:37.570 10:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:37.570 [2024-07-15 10:35:14.628715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:37.570 [2024-07-15 10:35:14.633647] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc25ce0 00:27:37.570 [2024-07-15 10:35:14.635859] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:37.570 10:35:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:38.502 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:38.502 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:38.502 10:35:15 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:38.502 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:38.502 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:38.502 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.502 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.760 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:38.760 "name": "raid_bdev1", 00:27:38.760 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:38.760 "strip_size_kb": 0, 00:27:38.760 "state": "online", 00:27:38.760 "raid_level": "raid1", 00:27:38.760 "superblock": true, 00:27:38.760 "num_base_bdevs": 2, 00:27:38.760 "num_base_bdevs_discovered": 2, 00:27:38.760 "num_base_bdevs_operational": 2, 00:27:38.760 "process": { 00:27:38.760 "type": "rebuild", 00:27:38.760 "target": "spare", 00:27:38.760 "progress": { 00:27:38.760 "blocks": 3072, 00:27:38.760 "percent": 38 00:27:38.760 } 00:27:38.760 }, 00:27:38.760 "base_bdevs_list": [ 00:27:38.760 { 00:27:38.760 "name": "spare", 00:27:38.760 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:38.760 "is_configured": true, 00:27:38.760 "data_offset": 256, 00:27:38.760 "data_size": 7936 00:27:38.760 }, 00:27:38.760 { 00:27:38.760 "name": "BaseBdev2", 00:27:38.760 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:38.760 "is_configured": true, 00:27:38.760 "data_offset": 256, 00:27:38.760 "data_size": 7936 00:27:38.760 } 00:27:38.760 ] 00:27:38.760 }' 00:27:38.760 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:38.760 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:38.760 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:39.017 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:39.017 10:35:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:39.275 [2024-07-15 10:35:16.218578] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:39.275 [2024-07-15 10:35:16.248530] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:39.275 [2024-07-15 10:35:16.248574] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:39.275 [2024-07-15 10:35:16.248590] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:39.275 [2024-07-15 10:35:16.248598] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.275 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.532 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:39.532 "name": "raid_bdev1", 00:27:39.533 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:39.533 "strip_size_kb": 0, 00:27:39.533 "state": "online", 00:27:39.533 "raid_level": "raid1", 00:27:39.533 "superblock": true, 00:27:39.533 "num_base_bdevs": 2, 00:27:39.533 "num_base_bdevs_discovered": 1, 00:27:39.533 "num_base_bdevs_operational": 1, 00:27:39.533 "base_bdevs_list": [ 00:27:39.533 { 00:27:39.533 "name": null, 00:27:39.533 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.533 "is_configured": false, 00:27:39.533 "data_offset": 256, 00:27:39.533 "data_size": 7936 00:27:39.533 }, 00:27:39.533 { 00:27:39.533 "name": "BaseBdev2", 00:27:39.533 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:39.533 "is_configured": true, 00:27:39.533 "data_offset": 256, 00:27:39.533 "data_size": 7936 00:27:39.533 } 00:27:39.533 ] 00:27:39.533 }' 00:27:39.533 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:39.533 10:35:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:40.097 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:40.097 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:27:40.097 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:40.097 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:40.097 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:40.097 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.097 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.355 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:40.355 "name": "raid_bdev1", 00:27:40.355 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:40.355 "strip_size_kb": 0, 00:27:40.355 "state": "online", 00:27:40.355 "raid_level": "raid1", 00:27:40.355 "superblock": true, 00:27:40.355 "num_base_bdevs": 2, 00:27:40.355 "num_base_bdevs_discovered": 1, 00:27:40.355 "num_base_bdevs_operational": 1, 00:27:40.355 "base_bdevs_list": [ 00:27:40.355 { 00:27:40.355 "name": null, 00:27:40.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:40.355 "is_configured": false, 00:27:40.355 "data_offset": 256, 00:27:40.355 "data_size": 7936 00:27:40.355 }, 00:27:40.355 { 00:27:40.355 "name": "BaseBdev2", 00:27:40.355 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:40.355 "is_configured": true, 00:27:40.355 "data_offset": 256, 00:27:40.355 "data_size": 7936 00:27:40.355 } 00:27:40.355 ] 00:27:40.355 }' 00:27:40.355 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:40.355 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:40.355 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:27:40.355 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:40.355 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:40.612 [2024-07-15 10:35:17.636717] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:40.612 [2024-07-15 10:35:17.642315] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc25ce0 00:27:40.612 [2024-07-15 10:35:17.643820] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:40.612 10:35:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:41.543 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:41.543 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:41.543 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:41.543 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:41.543 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:41.543 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.543 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.801 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:41.801 "name": "raid_bdev1", 00:27:41.801 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:41.801 "strip_size_kb": 0, 00:27:41.801 "state": "online", 00:27:41.801 
"raid_level": "raid1", 00:27:41.801 "superblock": true, 00:27:41.801 "num_base_bdevs": 2, 00:27:41.801 "num_base_bdevs_discovered": 2, 00:27:41.801 "num_base_bdevs_operational": 2, 00:27:41.801 "process": { 00:27:41.801 "type": "rebuild", 00:27:41.801 "target": "spare", 00:27:41.801 "progress": { 00:27:41.801 "blocks": 3072, 00:27:41.801 "percent": 38 00:27:41.801 } 00:27:41.801 }, 00:27:41.801 "base_bdevs_list": [ 00:27:41.801 { 00:27:41.801 "name": "spare", 00:27:41.801 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:41.801 "is_configured": true, 00:27:41.801 "data_offset": 256, 00:27:41.801 "data_size": 7936 00:27:41.801 }, 00:27:41.801 { 00:27:41.801 "name": "BaseBdev2", 00:27:41.801 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:41.801 "is_configured": true, 00:27:41.801 "data_offset": 256, 00:27:41.801 "data_size": 7936 00:27:41.801 } 00:27:41.801 ] 00:27:41.801 }' 00:27:41.801 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:41.801 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:41.801 10:35:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:42.058 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:42.059 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1004 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.059 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.316 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:42.316 "name": "raid_bdev1", 00:27:42.316 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:42.316 "strip_size_kb": 0, 00:27:42.316 "state": "online", 00:27:42.316 "raid_level": "raid1", 00:27:42.316 "superblock": true, 00:27:42.316 "num_base_bdevs": 2, 00:27:42.316 "num_base_bdevs_discovered": 2, 00:27:42.316 "num_base_bdevs_operational": 2, 00:27:42.316 "process": { 00:27:42.316 "type": "rebuild", 00:27:42.316 "target": "spare", 00:27:42.316 "progress": { 00:27:42.316 "blocks": 3840, 00:27:42.316 "percent": 48 00:27:42.316 } 00:27:42.316 }, 00:27:42.316 "base_bdevs_list": [ 00:27:42.316 { 00:27:42.316 "name": "spare", 00:27:42.316 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:42.316 "is_configured": 
true, 00:27:42.316 "data_offset": 256, 00:27:42.316 "data_size": 7936 00:27:42.316 }, 00:27:42.316 { 00:27:42.316 "name": "BaseBdev2", 00:27:42.316 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:42.316 "is_configured": true, 00:27:42.316 "data_offset": 256, 00:27:42.316 "data_size": 7936 00:27:42.316 } 00:27:42.316 ] 00:27:42.316 }' 00:27:42.316 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:42.316 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:42.316 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:42.316 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:42.316 10:35:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.247 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.505 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:43.505 "name": "raid_bdev1", 00:27:43.505 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:43.505 "strip_size_kb": 0, 00:27:43.505 "state": "online", 00:27:43.505 "raid_level": "raid1", 00:27:43.505 "superblock": true, 00:27:43.505 "num_base_bdevs": 2, 00:27:43.505 "num_base_bdevs_discovered": 2, 00:27:43.505 "num_base_bdevs_operational": 2, 00:27:43.505 "process": { 00:27:43.505 "type": "rebuild", 00:27:43.505 "target": "spare", 00:27:43.505 "progress": { 00:27:43.505 "blocks": 7424, 00:27:43.505 "percent": 93 00:27:43.505 } 00:27:43.505 }, 00:27:43.505 "base_bdevs_list": [ 00:27:43.505 { 00:27:43.505 "name": "spare", 00:27:43.505 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:43.505 "is_configured": true, 00:27:43.505 "data_offset": 256, 00:27:43.505 "data_size": 7936 00:27:43.505 }, 00:27:43.505 { 00:27:43.505 "name": "BaseBdev2", 00:27:43.505 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:43.505 "is_configured": true, 00:27:43.505 "data_offset": 256, 00:27:43.505 "data_size": 7936 00:27:43.505 } 00:27:43.505 ] 00:27:43.505 }' 00:27:43.505 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:43.505 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:43.505 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:43.505 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:43.505 10:35:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:43.762 [2024-07-15 10:35:20.767649] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:43.762 [2024-07-15 10:35:20.767705] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:43.762 [2024-07-15 10:35:20.767788] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:44.692 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:44.692 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:44.692 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:44.692 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:44.692 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:44.692 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:44.693 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.693 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.949 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:44.949 "name": "raid_bdev1", 00:27:44.949 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:44.949 "strip_size_kb": 0, 00:27:44.949 "state": "online", 00:27:44.949 "raid_level": "raid1", 00:27:44.949 "superblock": true, 00:27:44.949 "num_base_bdevs": 2, 00:27:44.949 "num_base_bdevs_discovered": 2, 00:27:44.949 "num_base_bdevs_operational": 2, 00:27:44.949 "base_bdevs_list": [ 00:27:44.949 { 00:27:44.949 "name": "spare", 00:27:44.949 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:44.949 "is_configured": true, 00:27:44.949 "data_offset": 256, 00:27:44.949 "data_size": 7936 00:27:44.949 }, 00:27:44.949 { 00:27:44.949 "name": "BaseBdev2", 00:27:44.949 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:44.949 "is_configured": true, 00:27:44.949 "data_offset": 256, 00:27:44.949 
"data_size": 7936 00:27:44.949 } 00:27:44.949 ] 00:27:44.949 }' 00:27:44.949 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:44.949 10:35:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.949 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:45.206 "name": "raid_bdev1", 00:27:45.206 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:45.206 "strip_size_kb": 0, 00:27:45.206 "state": "online", 00:27:45.206 "raid_level": "raid1", 00:27:45.206 "superblock": true, 00:27:45.206 "num_base_bdevs": 2, 00:27:45.206 "num_base_bdevs_discovered": 2, 00:27:45.206 "num_base_bdevs_operational": 2, 00:27:45.206 
"base_bdevs_list": [ 00:27:45.206 { 00:27:45.206 "name": "spare", 00:27:45.206 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:45.206 "is_configured": true, 00:27:45.206 "data_offset": 256, 00:27:45.206 "data_size": 7936 00:27:45.206 }, 00:27:45.206 { 00:27:45.206 "name": "BaseBdev2", 00:27:45.206 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:45.206 "is_configured": true, 00:27:45.206 "data_offset": 256, 00:27:45.206 "data_size": 7936 00:27:45.206 } 00:27:45.206 ] 00:27:45.206 }' 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.206 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.463 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.463 "name": "raid_bdev1", 00:27:45.463 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:45.463 "strip_size_kb": 0, 00:27:45.463 "state": "online", 00:27:45.463 "raid_level": "raid1", 00:27:45.463 "superblock": true, 00:27:45.463 "num_base_bdevs": 2, 00:27:45.463 "num_base_bdevs_discovered": 2, 00:27:45.463 "num_base_bdevs_operational": 2, 00:27:45.463 "base_bdevs_list": [ 00:27:45.463 { 00:27:45.463 "name": "spare", 00:27:45.463 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:45.463 "is_configured": true, 00:27:45.463 "data_offset": 256, 00:27:45.463 "data_size": 7936 00:27:45.463 }, 00:27:45.463 { 00:27:45.463 "name": "BaseBdev2", 00:27:45.463 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:45.463 "is_configured": true, 00:27:45.463 "data_offset": 256, 00:27:45.463 "data_size": 7936 00:27:45.463 } 00:27:45.463 ] 00:27:45.463 }' 00:27:45.463 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.463 10:35:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:46.028 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:46.285 [2024-07-15 10:35:23.291639] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:46.285 [2024-07-15 10:35:23.291667] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:27:46.285 [2024-07-15 10:35:23.291727] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:46.285 [2024-07-15 10:35:23.291781] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:46.285 [2024-07-15 10:35:23.291793] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc26070 name raid_bdev1, state offline 00:27:46.285 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.285 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:46.541 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:46.542 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:27:46.542 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 
0 )) 00:27:46.542 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:46.542 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:46.798 /dev/nbd0 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:46.798 1+0 records in 00:27:46.798 1+0 records out 00:27:46.798 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251568 s, 16.3 MB/s 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:27:46.798 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:46.799 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:46.799 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:46.799 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:46.799 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:46.799 10:35:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:47.055 /dev/nbd1 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:47.055 1+0 records in 00:27:47.055 1+0 records out 00:27:47.055 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309714 s, 13.2 MB/s 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:27:47.055 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:47.055 10:35:24 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:47.311 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:47.311 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:47.311 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:47.311 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:47.311 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:47.311 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:47.311 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:27:47.567 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:27:47.567 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:47.567 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:47.567 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:47.567 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:47.567 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:47.567 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:27:47.826 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:27:47.827 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:47.827 10:35:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:48.085 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:48.342 [2024-07-15 10:35:25.298320] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:48.342 [2024-07-15 10:35:25.298365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:48.342 [2024-07-15 10:35:25.298386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc25500 00:27:48.342 [2024-07-15 10:35:25.298398] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:48.342 [2024-07-15 10:35:25.300017] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:48.342 [2024-07-15 10:35:25.300048] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:48.342 [2024-07-15 10:35:25.300131] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:48.342 [2024-07-15 10:35:25.300160] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:48.342 [2024-07-15 10:35:25.300256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:48.342 spare 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.342 [2024-07-15 10:35:25.400568] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc24260 00:27:48.342 [2024-07-15 10:35:25.400585] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:48.342 [2024-07-15 10:35:25.400782] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc1f490 00:27:48.342 [2024-07-15 10:35:25.400933] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc24260 00:27:48.342 [2024-07-15 10:35:25.400944] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc24260 00:27:48.342 [2024-07-15 10:35:25.401048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:48.342 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:48.342 "name": "raid_bdev1", 00:27:48.342 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:48.342 "strip_size_kb": 0, 00:27:48.342 "state": "online", 00:27:48.342 "raid_level": "raid1", 00:27:48.342 "superblock": true, 00:27:48.342 "num_base_bdevs": 2, 00:27:48.342 "num_base_bdevs_discovered": 2, 00:27:48.342 "num_base_bdevs_operational": 2, 00:27:48.342 "base_bdevs_list": [ 00:27:48.342 { 00:27:48.342 "name": "spare", 00:27:48.342 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:48.342 "is_configured": true, 00:27:48.342 "data_offset": 256, 00:27:48.342 "data_size": 7936 00:27:48.343 }, 00:27:48.343 { 00:27:48.343 "name": "BaseBdev2", 00:27:48.343 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:48.343 "is_configured": true, 00:27:48.343 "data_offset": 256, 00:27:48.343 "data_size": 7936 00:27:48.343 } 00:27:48.343 ] 00:27:48.343 }' 00:27:48.343 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:48.343 10:35:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:48.906 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:48.906 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:48.906 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:48.906 
10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:48.906 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:48.906 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.906 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.163 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:49.163 "name": "raid_bdev1", 00:27:49.163 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:49.163 "strip_size_kb": 0, 00:27:49.163 "state": "online", 00:27:49.163 "raid_level": "raid1", 00:27:49.163 "superblock": true, 00:27:49.163 "num_base_bdevs": 2, 00:27:49.163 "num_base_bdevs_discovered": 2, 00:27:49.163 "num_base_bdevs_operational": 2, 00:27:49.163 "base_bdevs_list": [ 00:27:49.163 { 00:27:49.163 "name": "spare", 00:27:49.163 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:49.163 "is_configured": true, 00:27:49.163 "data_offset": 256, 00:27:49.163 "data_size": 7936 00:27:49.163 }, 00:27:49.163 { 00:27:49.163 "name": "BaseBdev2", 00:27:49.163 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:49.163 "is_configured": true, 00:27:49.163 "data_offset": 256, 00:27:49.163 "data_size": 7936 00:27:49.163 } 00:27:49.163 ] 00:27:49.163 }' 00:27:49.163 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:49.421 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:49.421 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:49.421 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:49.421 10:35:26 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:49.421 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.678 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:49.678 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:49.935 [2024-07-15 10:35:26.902683] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.935 10:35:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.192 10:35:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.192 "name": "raid_bdev1", 00:27:50.192 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:50.192 "strip_size_kb": 0, 00:27:50.192 "state": "online", 00:27:50.192 "raid_level": "raid1", 00:27:50.192 "superblock": true, 00:27:50.192 "num_base_bdevs": 2, 00:27:50.192 "num_base_bdevs_discovered": 1, 00:27:50.192 "num_base_bdevs_operational": 1, 00:27:50.192 "base_bdevs_list": [ 00:27:50.192 { 00:27:50.192 "name": null, 00:27:50.192 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.192 "is_configured": false, 00:27:50.192 "data_offset": 256, 00:27:50.192 "data_size": 7936 00:27:50.192 }, 00:27:50.192 { 00:27:50.192 "name": "BaseBdev2", 00:27:50.192 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:50.192 "is_configured": true, 00:27:50.192 "data_offset": 256, 00:27:50.192 "data_size": 7936 00:27:50.192 } 00:27:50.192 ] 00:27:50.192 }' 00:27:50.192 10:35:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.193 10:35:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:50.757 10:35:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:51.013 [2024-07-15 10:35:27.985568] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:51.013 [2024-07-15 10:35:27.985715] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:51.013 [2024-07-15 10:35:27.985730] 
bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:51.013 [2024-07-15 10:35:27.985758] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:51.013 [2024-07-15 10:35:27.990572] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc1f490 00:27:51.013 [2024-07-15 10:35:27.992934] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:51.013 10:35:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:51.940 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:51.940 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:51.940 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:51.940 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:51.940 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:51.941 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.941 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.196 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:52.196 "name": "raid_bdev1", 00:27:52.196 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:52.196 "strip_size_kb": 0, 00:27:52.196 "state": "online", 00:27:52.196 "raid_level": "raid1", 00:27:52.196 "superblock": true, 00:27:52.196 "num_base_bdevs": 2, 00:27:52.196 "num_base_bdevs_discovered": 2, 00:27:52.196 "num_base_bdevs_operational": 2, 00:27:52.196 "process": { 00:27:52.196 "type": "rebuild", 
00:27:52.196 "target": "spare", 00:27:52.196 "progress": { 00:27:52.196 "blocks": 3072, 00:27:52.196 "percent": 38 00:27:52.196 } 00:27:52.196 }, 00:27:52.196 "base_bdevs_list": [ 00:27:52.196 { 00:27:52.196 "name": "spare", 00:27:52.196 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:52.196 "is_configured": true, 00:27:52.196 "data_offset": 256, 00:27:52.196 "data_size": 7936 00:27:52.196 }, 00:27:52.196 { 00:27:52.196 "name": "BaseBdev2", 00:27:52.196 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:52.196 "is_configured": true, 00:27:52.196 "data_offset": 256, 00:27:52.196 "data_size": 7936 00:27:52.196 } 00:27:52.196 ] 00:27:52.196 }' 00:27:52.196 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:52.196 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:52.196 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:52.196 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:52.196 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:52.453 [2024-07-15 10:35:29.583620] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:52.453 [2024-07-15 10:35:29.605406] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:52.453 [2024-07-15 10:35:29.605447] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:52.453 [2024-07-15 10:35:29.605462] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:52.453 [2024-07-15 10:35:29.605470] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:52.453 10:35:29 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.453 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.711 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.711 "name": "raid_bdev1", 00:27:52.711 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:52.711 "strip_size_kb": 0, 00:27:52.711 "state": "online", 00:27:52.711 "raid_level": "raid1", 00:27:52.711 "superblock": true, 00:27:52.711 "num_base_bdevs": 2, 00:27:52.711 "num_base_bdevs_discovered": 1, 00:27:52.711 "num_base_bdevs_operational": 1, 00:27:52.711 "base_bdevs_list": [ 00:27:52.711 { 00:27:52.711 "name": null, 00:27:52.711 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:52.711 "is_configured": false, 00:27:52.711 "data_offset": 256, 00:27:52.711 "data_size": 7936 00:27:52.711 }, 00:27:52.711 { 00:27:52.711 "name": "BaseBdev2", 00:27:52.711 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:52.711 "is_configured": true, 00:27:52.711 "data_offset": 256, 00:27:52.711 "data_size": 7936 00:27:52.711 } 00:27:52.711 ] 00:27:52.711 }' 00:27:52.711 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.711 10:35:29 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:53.642 10:35:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:53.642 [2024-07-15 10:35:30.721122] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:53.642 [2024-07-15 10:35:30.721173] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:53.642 [2024-07-15 10:35:30.721196] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc25730 00:27:53.642 [2024-07-15 10:35:30.721209] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:53.642 [2024-07-15 10:35:30.721575] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:53.642 [2024-07-15 10:35:30.721593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:53.642 [2024-07-15 10:35:30.721672] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:53.642 [2024-07-15 10:35:30.721683] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:53.642 [2024-07-15 10:35:30.721694] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev 
raid_bdev1. 00:27:53.642 [2024-07-15 10:35:30.721713] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:53.642 [2024-07-15 10:35:30.726596] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc26aa0 00:27:53.642 spare 00:27:53.642 [2024-07-15 10:35:30.728069] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:53.642 10:35:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:54.571 10:35:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:54.571 10:35:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:54.571 10:35:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:54.571 10:35:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:54.571 10:35:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:54.571 10:35:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.571 10:35:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.828 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:54.828 "name": "raid_bdev1", 00:27:54.828 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:54.828 "strip_size_kb": 0, 00:27:54.828 "state": "online", 00:27:54.828 "raid_level": "raid1", 00:27:54.828 "superblock": true, 00:27:54.828 "num_base_bdevs": 2, 00:27:54.828 "num_base_bdevs_discovered": 2, 00:27:54.828 "num_base_bdevs_operational": 2, 00:27:54.828 "process": { 00:27:54.828 "type": "rebuild", 00:27:54.828 "target": "spare", 00:27:54.828 "progress": { 
00:27:54.828 "blocks": 3072, 00:27:54.828 "percent": 38 00:27:54.828 } 00:27:54.828 }, 00:27:54.828 "base_bdevs_list": [ 00:27:54.828 { 00:27:54.828 "name": "spare", 00:27:54.828 "uuid": "0ba54ab8-9fc6-5709-a6de-785e2a970fa2", 00:27:54.828 "is_configured": true, 00:27:54.828 "data_offset": 256, 00:27:54.828 "data_size": 7936 00:27:54.828 }, 00:27:54.828 { 00:27:54.828 "name": "BaseBdev2", 00:27:54.828 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:54.828 "is_configured": true, 00:27:54.828 "data_offset": 256, 00:27:54.828 "data_size": 7936 00:27:54.828 } 00:27:54.828 ] 00:27:54.828 }' 00:27:54.828 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:55.085 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:55.085 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:55.085 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:55.085 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:55.343 [2024-07-15 10:35:32.360400] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:55.343 [2024-07-15 10:35:32.441574] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:55.343 [2024-07-15 10:35:32.441616] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:55.343 [2024-07-15 10:35:32.441631] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:55.343 [2024-07-15 10:35:32.441640] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:55.343 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.601 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:55.601 "name": "raid_bdev1", 00:27:55.601 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:55.601 "strip_size_kb": 0, 00:27:55.601 "state": "online", 00:27:55.601 "raid_level": "raid1", 00:27:55.601 "superblock": true, 00:27:55.601 "num_base_bdevs": 2, 00:27:55.601 "num_base_bdevs_discovered": 1, 00:27:55.601 "num_base_bdevs_operational": 1, 00:27:55.601 "base_bdevs_list": [ 00:27:55.601 { 00:27:55.601 "name": null, 00:27:55.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:55.601 
"is_configured": false, 00:27:55.601 "data_offset": 256, 00:27:55.601 "data_size": 7936 00:27:55.601 }, 00:27:55.601 { 00:27:55.601 "name": "BaseBdev2", 00:27:55.601 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:55.601 "is_configured": true, 00:27:55.601 "data_offset": 256, 00:27:55.601 "data_size": 7936 00:27:55.601 } 00:27:55.601 ] 00:27:55.601 }' 00:27:55.601 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:55.601 10:35:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:56.167 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:56.167 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:56.167 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:56.167 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:56.167 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:56.167 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.167 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.424 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:56.424 "name": "raid_bdev1", 00:27:56.424 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:56.424 "strip_size_kb": 0, 00:27:56.424 "state": "online", 00:27:56.424 "raid_level": "raid1", 00:27:56.424 "superblock": true, 00:27:56.424 "num_base_bdevs": 2, 00:27:56.424 "num_base_bdevs_discovered": 1, 00:27:56.424 "num_base_bdevs_operational": 1, 00:27:56.424 "base_bdevs_list": [ 00:27:56.424 { 00:27:56.424 "name": null, 
00:27:56.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:56.424 "is_configured": false, 00:27:56.424 "data_offset": 256, 00:27:56.424 "data_size": 7936 00:27:56.424 }, 00:27:56.424 { 00:27:56.424 "name": "BaseBdev2", 00:27:56.424 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:56.424 "is_configured": true, 00:27:56.424 "data_offset": 256, 00:27:56.424 "data_size": 7936 00:27:56.424 } 00:27:56.424 ] 00:27:56.424 }' 00:27:56.424 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:56.424 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:56.424 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:56.683 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:56.683 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:56.940 10:35:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:56.940 [2024-07-15 10:35:34.119037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:56.940 [2024-07-15 10:35:34.119081] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:56.940 [2024-07-15 10:35:34.119101] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc1fda0 00:27:56.940 [2024-07-15 10:35:34.119114] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:56.940 [2024-07-15 10:35:34.119452] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:56.940 [2024-07-15 10:35:34.119469] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:56.940 [2024-07-15 10:35:34.119530] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:56.940 [2024-07-15 10:35:34.119542] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:56.940 [2024-07-15 10:35:34.119553] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:56.940 BaseBdev1 00:27:57.198 10:35:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:58.131 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.131 10:35:35 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.389 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:58.389 "name": "raid_bdev1", 00:27:58.389 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:58.389 "strip_size_kb": 0, 00:27:58.389 "state": "online", 00:27:58.389 "raid_level": "raid1", 00:27:58.389 "superblock": true, 00:27:58.389 "num_base_bdevs": 2, 00:27:58.389 "num_base_bdevs_discovered": 1, 00:27:58.389 "num_base_bdevs_operational": 1, 00:27:58.389 "base_bdevs_list": [ 00:27:58.389 { 00:27:58.389 "name": null, 00:27:58.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.389 "is_configured": false, 00:27:58.389 "data_offset": 256, 00:27:58.389 "data_size": 7936 00:27:58.389 }, 00:27:58.389 { 00:27:58.389 "name": "BaseBdev2", 00:27:58.389 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:58.389 "is_configured": true, 00:27:58.389 "data_offset": 256, 00:27:58.389 "data_size": 7936 00:27:58.389 } 00:27:58.389 ] 00:27:58.389 }' 00:27:58.389 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:58.389 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:27:58.955 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:58.955 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:58.955 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:58.955 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:58.955 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:58.955 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.955 10:35:35 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.214 "name": "raid_bdev1", 00:27:59.214 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:27:59.214 "strip_size_kb": 0, 00:27:59.214 "state": "online", 00:27:59.214 "raid_level": "raid1", 00:27:59.214 "superblock": true, 00:27:59.214 "num_base_bdevs": 2, 00:27:59.214 "num_base_bdevs_discovered": 1, 00:27:59.214 "num_base_bdevs_operational": 1, 00:27:59.214 "base_bdevs_list": [ 00:27:59.214 { 00:27:59.214 "name": null, 00:27:59.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:59.214 "is_configured": false, 00:27:59.214 "data_offset": 256, 00:27:59.214 "data_size": 7936 00:27:59.214 }, 00:27:59.214 { 00:27:59.214 "name": "BaseBdev2", 00:27:59.214 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:27:59.214 "is_configured": true, 00:27:59.214 "data_offset": 256, 00:27:59.214 "data_size": 7936 00:27:59.214 } 00:27:59.214 ] 00:27:59.214 }' 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:59.214 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:59.472 [2024-07-15 10:35:36.521423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:59.472 [2024-07-15 10:35:36.521539] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:59.472 [2024-07-15 10:35:36.521555] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:27:59.472 request: 00:27:59.472 { 00:27:59.472 "base_bdev": "BaseBdev1", 00:27:59.472 "raid_bdev": "raid_bdev1", 00:27:59.472 "method": "bdev_raid_add_base_bdev", 00:27:59.472 "req_id": 1 00:27:59.472 } 00:27:59.472 Got JSON-RPC error response 00:27:59.472 response: 00:27:59.472 { 00:27:59.472 "code": -22, 00:27:59.472 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:59.472 } 00:27:59.472 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:27:59.472 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:59.472 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:59.472 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:59.472 10:35:36 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.406 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.667 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:00.667 "name": "raid_bdev1", 00:28:00.667 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:28:00.667 "strip_size_kb": 0, 00:28:00.667 "state": "online", 00:28:00.667 "raid_level": "raid1", 00:28:00.667 "superblock": true, 00:28:00.667 "num_base_bdevs": 2, 00:28:00.667 "num_base_bdevs_discovered": 1, 00:28:00.667 "num_base_bdevs_operational": 1, 00:28:00.667 "base_bdevs_list": [ 00:28:00.667 { 00:28:00.667 "name": null, 00:28:00.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.667 "is_configured": false, 00:28:00.667 "data_offset": 256, 00:28:00.667 "data_size": 7936 00:28:00.667 }, 00:28:00.667 { 00:28:00.667 "name": "BaseBdev2", 00:28:00.667 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:28:00.667 "is_configured": true, 00:28:00.667 "data_offset": 256, 00:28:00.667 "data_size": 7936 00:28:00.667 } 00:28:00.667 ] 00:28:00.667 }' 00:28:00.667 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:00.667 10:35:37 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:01.233 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:01.233 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:01.233 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:01.233 
10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:01.233 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:01.233 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.233 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.492 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:01.492 "name": "raid_bdev1", 00:28:01.492 "uuid": "ba6dbe57-3fc4-4c63-b0e9-4a73884408ab", 00:28:01.492 "strip_size_kb": 0, 00:28:01.492 "state": "online", 00:28:01.492 "raid_level": "raid1", 00:28:01.492 "superblock": true, 00:28:01.492 "num_base_bdevs": 2, 00:28:01.492 "num_base_bdevs_discovered": 1, 00:28:01.492 "num_base_bdevs_operational": 1, 00:28:01.492 "base_bdevs_list": [ 00:28:01.492 { 00:28:01.492 "name": null, 00:28:01.492 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:01.492 "is_configured": false, 00:28:01.492 "data_offset": 256, 00:28:01.492 "data_size": 7936 00:28:01.492 }, 00:28:01.492 { 00:28:01.492 "name": "BaseBdev2", 00:28:01.492 "uuid": "4bda04d5-84be-510f-b867-6d99c1730124", 00:28:01.492 "is_configured": true, 00:28:01.492 "data_offset": 256, 00:28:01.492 "data_size": 7936 00:28:01.492 } 00:28:01.492 ] 00:28:01.492 }' 00:28:01.492 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:01.492 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:01.492 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:01.751 10:35:38 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 616262 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 616262 ']' 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 616262 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 616262 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 616262' 00:28:01.751 killing process with pid 616262 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 616262 00:28:01.751 Received shutdown signal, test time was about 60.000000 seconds 00:28:01.751 00:28:01.751 Latency(us) 00:28:01.751 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:01.751 =================================================================================================================== 00:28:01.751 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:01.751 [2024-07-15 10:35:38.782039] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:01.751 [2024-07-15 10:35:38.782138] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:01.751 [2024-07-15 10:35:38.782180] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:01.751 [2024-07-15 10:35:38.782199] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc24260 name raid_bdev1, state offline 00:28:01.751 10:35:38 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 616262 00:28:01.751 [2024-07-15 10:35:38.808445] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:02.010 10:35:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:28:02.010 00:28:02.010 real 0m31.371s 00:28:02.010 user 0m48.723s 00:28:02.010 sys 0m5.119s 00:28:02.010 10:35:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:02.010 10:35:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:02.010 ************************************ 00:28:02.010 END TEST raid_rebuild_test_sb_4k 00:28:02.010 ************************************ 00:28:02.010 10:35:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:02.010 10:35:39 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:28:02.010 10:35:39 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:28:02.010 10:35:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:02.010 10:35:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:02.010 10:35:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:02.010 ************************************ 00:28:02.010 START TEST raid_state_function_test_sb_md_separate 00:28:02.010 ************************************ 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:02.010 
10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:02.010 10:35:39 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=620769 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 620769' 00:28:02.010 Process raid pid: 620769 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 620769 /var/tmp/spdk-raid.sock 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 620769 ']' 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:02.010 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:02.010 10:35:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:02.010 [2024-07-15 10:35:39.175243] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:28:02.010 [2024-07-15 10:35:39.175308] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:02.269 [2024-07-15 10:35:39.305218] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:02.269 [2024-07-15 10:35:39.411624] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:02.529 [2024-07-15 10:35:39.475222] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:02.529 [2024-07-15 10:35:39.475246] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:03.156 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:03.156 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:03.156 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:03.156 [2024-07-15 10:35:40.317571] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:03.156 [2024-07-15 10:35:40.317613] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:03.156 [2024-07-15 10:35:40.317624] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:03.156 
[2024-07-15 10:35:40.317635] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:03.156 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:03.156 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:03.156 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.157 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:03.721 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.721 "name": "Existed_Raid", 00:28:03.721 "uuid": 
"69dc280a-678d-454e-a903-fc2bfca1d1d7", 00:28:03.721 "strip_size_kb": 0, 00:28:03.721 "state": "configuring", 00:28:03.721 "raid_level": "raid1", 00:28:03.721 "superblock": true, 00:28:03.721 "num_base_bdevs": 2, 00:28:03.721 "num_base_bdevs_discovered": 0, 00:28:03.721 "num_base_bdevs_operational": 2, 00:28:03.721 "base_bdevs_list": [ 00:28:03.721 { 00:28:03.721 "name": "BaseBdev1", 00:28:03.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:03.721 "is_configured": false, 00:28:03.721 "data_offset": 0, 00:28:03.721 "data_size": 0 00:28:03.721 }, 00:28:03.721 { 00:28:03.721 "name": "BaseBdev2", 00:28:03.721 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:03.721 "is_configured": false, 00:28:03.721 "data_offset": 0, 00:28:03.721 "data_size": 0 00:28:03.721 } 00:28:03.721 ] 00:28:03.721 }' 00:28:03.721 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.721 10:35:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:04.390 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:04.390 [2024-07-15 10:35:41.520600] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:04.390 [2024-07-15 10:35:41.520632] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfea80 name Existed_Raid, state configuring 00:28:04.390 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:04.648 [2024-07-15 10:35:41.705111] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:04.648 [2024-07-15 10:35:41.705138] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:04.648 [2024-07-15 10:35:41.705148] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:04.648 [2024-07-15 10:35:41.705159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:04.648 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:28:04.905 [2024-07-15 10:35:41.883945] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:04.905 BaseBdev1 00:28:04.905 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:04.905 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:04.905 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:04.905 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:04.905 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:04.905 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:04.905 10:35:41 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:05.160 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:05.417 [ 00:28:05.417 { 00:28:05.417 "name": 
"BaseBdev1", 00:28:05.417 "aliases": [ 00:28:05.417 "62fe4751-c1e5-410f-a0d6-ea75f375e8e7" 00:28:05.417 ], 00:28:05.417 "product_name": "Malloc disk", 00:28:05.417 "block_size": 4096, 00:28:05.417 "num_blocks": 8192, 00:28:05.417 "uuid": "62fe4751-c1e5-410f-a0d6-ea75f375e8e7", 00:28:05.417 "md_size": 32, 00:28:05.417 "md_interleave": false, 00:28:05.417 "dif_type": 0, 00:28:05.417 "assigned_rate_limits": { 00:28:05.417 "rw_ios_per_sec": 0, 00:28:05.417 "rw_mbytes_per_sec": 0, 00:28:05.417 "r_mbytes_per_sec": 0, 00:28:05.417 "w_mbytes_per_sec": 0 00:28:05.417 }, 00:28:05.417 "claimed": true, 00:28:05.417 "claim_type": "exclusive_write", 00:28:05.417 "zoned": false, 00:28:05.417 "supported_io_types": { 00:28:05.417 "read": true, 00:28:05.417 "write": true, 00:28:05.417 "unmap": true, 00:28:05.417 "flush": true, 00:28:05.417 "reset": true, 00:28:05.417 "nvme_admin": false, 00:28:05.417 "nvme_io": false, 00:28:05.417 "nvme_io_md": false, 00:28:05.417 "write_zeroes": true, 00:28:05.417 "zcopy": true, 00:28:05.417 "get_zone_info": false, 00:28:05.417 "zone_management": false, 00:28:05.417 "zone_append": false, 00:28:05.417 "compare": false, 00:28:05.417 "compare_and_write": false, 00:28:05.417 "abort": true, 00:28:05.417 "seek_hole": false, 00:28:05.417 "seek_data": false, 00:28:05.417 "copy": true, 00:28:05.417 "nvme_iov_md": false 00:28:05.417 }, 00:28:05.417 "memory_domains": [ 00:28:05.417 { 00:28:05.417 "dma_device_id": "system", 00:28:05.417 "dma_device_type": 1 00:28:05.417 }, 00:28:05.417 { 00:28:05.417 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:05.417 "dma_device_type": 2 00:28:05.417 } 00:28:05.417 ], 00:28:05.417 "driver_specific": {} 00:28:05.417 } 00:28:05.417 ] 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:05.417 
10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.417 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:05.673 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:05.673 "name": "Existed_Raid", 00:28:05.673 "uuid": "3dfccbf5-7700-4492-9833-92129bceed57", 00:28:05.673 "strip_size_kb": 0, 00:28:05.673 "state": "configuring", 00:28:05.673 "raid_level": "raid1", 00:28:05.673 "superblock": true, 00:28:05.673 "num_base_bdevs": 2, 00:28:05.673 "num_base_bdevs_discovered": 1, 00:28:05.673 "num_base_bdevs_operational": 2, 00:28:05.673 
"base_bdevs_list": [ 00:28:05.673 { 00:28:05.673 "name": "BaseBdev1", 00:28:05.673 "uuid": "62fe4751-c1e5-410f-a0d6-ea75f375e8e7", 00:28:05.673 "is_configured": true, 00:28:05.673 "data_offset": 256, 00:28:05.673 "data_size": 7936 00:28:05.673 }, 00:28:05.673 { 00:28:05.673 "name": "BaseBdev2", 00:28:05.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:05.673 "is_configured": false, 00:28:05.673 "data_offset": 0, 00:28:05.673 "data_size": 0 00:28:05.673 } 00:28:05.673 ] 00:28:05.673 }' 00:28:05.673 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:05.673 10:35:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:06.623 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:06.623 [2024-07-15 10:35:43.724832] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:06.623 [2024-07-15 10:35:43.724870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfe350 name Existed_Raid, state configuring 00:28:06.623 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:06.880 [2024-07-15 10:35:43.901335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:06.880 [2024-07-15 10:35:43.902791] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:06.880 [2024-07-15 10:35:43.902822] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:06.880 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:06.880 
10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.881 10:35:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:07.138 10:35:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.138 "name": "Existed_Raid", 00:28:07.138 "uuid": 
"e7340923-9a4b-4440-8f47-7a982938f039", 00:28:07.138 "strip_size_kb": 0, 00:28:07.138 "state": "configuring", 00:28:07.138 "raid_level": "raid1", 00:28:07.138 "superblock": true, 00:28:07.138 "num_base_bdevs": 2, 00:28:07.138 "num_base_bdevs_discovered": 1, 00:28:07.138 "num_base_bdevs_operational": 2, 00:28:07.138 "base_bdevs_list": [ 00:28:07.138 { 00:28:07.138 "name": "BaseBdev1", 00:28:07.138 "uuid": "62fe4751-c1e5-410f-a0d6-ea75f375e8e7", 00:28:07.138 "is_configured": true, 00:28:07.138 "data_offset": 256, 00:28:07.138 "data_size": 7936 00:28:07.138 }, 00:28:07.138 { 00:28:07.138 "name": "BaseBdev2", 00:28:07.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.138 "is_configured": false, 00:28:07.138 "data_offset": 0, 00:28:07.138 "data_size": 0 00:28:07.138 } 00:28:07.138 ] 00:28:07.138 }' 00:28:07.138 10:35:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.138 10:35:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:07.702 10:35:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:28:07.958 [2024-07-15 10:35:45.004382] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:07.958 [2024-07-15 10:35:45.004528] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d00210 00:28:07.958 [2024-07-15 10:35:45.004542] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:07.958 [2024-07-15 10:35:45.004609] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cffc50 00:28:07.958 [2024-07-15 10:35:45.004705] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d00210 00:28:07.958 [2024-07-15 10:35:45.004715] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid 
bdev is created with name Existed_Raid, raid_bdev 0x1d00210 00:28:07.958 [2024-07-15 10:35:45.004779] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:07.958 BaseBdev2 00:28:07.958 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:07.958 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:07.958 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:07.958 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:28:07.958 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:07.958 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:07.958 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:08.215 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:08.473 [ 00:28:08.473 { 00:28:08.473 "name": "BaseBdev2", 00:28:08.473 "aliases": [ 00:28:08.473 "baf7bb4e-30b9-484f-925c-7ddfae5cc12a" 00:28:08.473 ], 00:28:08.473 "product_name": "Malloc disk", 00:28:08.473 "block_size": 4096, 00:28:08.473 "num_blocks": 8192, 00:28:08.473 "uuid": "baf7bb4e-30b9-484f-925c-7ddfae5cc12a", 00:28:08.473 "md_size": 32, 00:28:08.473 "md_interleave": false, 00:28:08.473 "dif_type": 0, 00:28:08.473 "assigned_rate_limits": { 00:28:08.473 "rw_ios_per_sec": 0, 00:28:08.473 "rw_mbytes_per_sec": 0, 00:28:08.473 "r_mbytes_per_sec": 0, 00:28:08.473 
"w_mbytes_per_sec": 0 00:28:08.473 }, 00:28:08.473 "claimed": true, 00:28:08.473 "claim_type": "exclusive_write", 00:28:08.473 "zoned": false, 00:28:08.473 "supported_io_types": { 00:28:08.473 "read": true, 00:28:08.473 "write": true, 00:28:08.473 "unmap": true, 00:28:08.473 "flush": true, 00:28:08.473 "reset": true, 00:28:08.473 "nvme_admin": false, 00:28:08.473 "nvme_io": false, 00:28:08.473 "nvme_io_md": false, 00:28:08.473 "write_zeroes": true, 00:28:08.473 "zcopy": true, 00:28:08.473 "get_zone_info": false, 00:28:08.473 "zone_management": false, 00:28:08.473 "zone_append": false, 00:28:08.473 "compare": false, 00:28:08.473 "compare_and_write": false, 00:28:08.473 "abort": true, 00:28:08.473 "seek_hole": false, 00:28:08.473 "seek_data": false, 00:28:08.473 "copy": true, 00:28:08.473 "nvme_iov_md": false 00:28:08.473 }, 00:28:08.473 "memory_domains": [ 00:28:08.473 { 00:28:08.473 "dma_device_id": "system", 00:28:08.473 "dma_device_type": 1 00:28:08.473 }, 00:28:08.473 { 00:28:08.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:08.473 "dma_device_type": 2 00:28:08.473 } 00:28:08.473 ], 00:28:08.473 "driver_specific": {} 00:28:08.473 } 00:28:08.473 ] 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.473 10:35:45 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.473 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:08.731 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:08.731 "name": "Existed_Raid", 00:28:08.731 "uuid": "e7340923-9a4b-4440-8f47-7a982938f039", 00:28:08.731 "strip_size_kb": 0, 00:28:08.731 "state": "online", 00:28:08.731 "raid_level": "raid1", 00:28:08.731 "superblock": true, 00:28:08.731 "num_base_bdevs": 2, 00:28:08.731 "num_base_bdevs_discovered": 2, 00:28:08.731 "num_base_bdevs_operational": 2, 00:28:08.731 "base_bdevs_list": [ 00:28:08.731 { 00:28:08.731 "name": "BaseBdev1", 00:28:08.731 "uuid": "62fe4751-c1e5-410f-a0d6-ea75f375e8e7", 00:28:08.731 "is_configured": true, 00:28:08.731 "data_offset": 256, 00:28:08.731 "data_size": 7936 00:28:08.731 }, 00:28:08.731 { 00:28:08.731 "name": 
"BaseBdev2", 00:28:08.731 "uuid": "baf7bb4e-30b9-484f-925c-7ddfae5cc12a", 00:28:08.731 "is_configured": true, 00:28:08.731 "data_offset": 256, 00:28:08.731 "data_size": 7936 00:28:08.731 } 00:28:08.731 ] 00:28:08.731 }' 00:28:08.731 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:08.731 10:35:45 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:09.297 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:09.556 [2024-07-15 10:35:46.596906] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:09.556 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:09.556 "name": "Existed_Raid", 00:28:09.556 "aliases": [ 00:28:09.556 "e7340923-9a4b-4440-8f47-7a982938f039" 00:28:09.556 ], 00:28:09.556 "product_name": "Raid Volume", 00:28:09.556 "block_size": 4096, 
00:28:09.556 "num_blocks": 7936, 00:28:09.556 "uuid": "e7340923-9a4b-4440-8f47-7a982938f039", 00:28:09.556 "md_size": 32, 00:28:09.556 "md_interleave": false, 00:28:09.556 "dif_type": 0, 00:28:09.556 "assigned_rate_limits": { 00:28:09.556 "rw_ios_per_sec": 0, 00:28:09.556 "rw_mbytes_per_sec": 0, 00:28:09.556 "r_mbytes_per_sec": 0, 00:28:09.556 "w_mbytes_per_sec": 0 00:28:09.556 }, 00:28:09.556 "claimed": false, 00:28:09.556 "zoned": false, 00:28:09.556 "supported_io_types": { 00:28:09.556 "read": true, 00:28:09.556 "write": true, 00:28:09.556 "unmap": false, 00:28:09.556 "flush": false, 00:28:09.556 "reset": true, 00:28:09.556 "nvme_admin": false, 00:28:09.556 "nvme_io": false, 00:28:09.556 "nvme_io_md": false, 00:28:09.556 "write_zeroes": true, 00:28:09.556 "zcopy": false, 00:28:09.556 "get_zone_info": false, 00:28:09.556 "zone_management": false, 00:28:09.556 "zone_append": false, 00:28:09.556 "compare": false, 00:28:09.556 "compare_and_write": false, 00:28:09.556 "abort": false, 00:28:09.556 "seek_hole": false, 00:28:09.556 "seek_data": false, 00:28:09.556 "copy": false, 00:28:09.556 "nvme_iov_md": false 00:28:09.556 }, 00:28:09.556 "memory_domains": [ 00:28:09.556 { 00:28:09.556 "dma_device_id": "system", 00:28:09.556 "dma_device_type": 1 00:28:09.556 }, 00:28:09.556 { 00:28:09.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.556 "dma_device_type": 2 00:28:09.556 }, 00:28:09.556 { 00:28:09.556 "dma_device_id": "system", 00:28:09.556 "dma_device_type": 1 00:28:09.556 }, 00:28:09.556 { 00:28:09.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.556 "dma_device_type": 2 00:28:09.556 } 00:28:09.556 ], 00:28:09.556 "driver_specific": { 00:28:09.556 "raid": { 00:28:09.556 "uuid": "e7340923-9a4b-4440-8f47-7a982938f039", 00:28:09.556 "strip_size_kb": 0, 00:28:09.556 "state": "online", 00:28:09.556 "raid_level": "raid1", 00:28:09.556 "superblock": true, 00:28:09.556 "num_base_bdevs": 2, 00:28:09.556 "num_base_bdevs_discovered": 2, 00:28:09.556 
"num_base_bdevs_operational": 2, 00:28:09.556 "base_bdevs_list": [ 00:28:09.556 { 00:28:09.556 "name": "BaseBdev1", 00:28:09.556 "uuid": "62fe4751-c1e5-410f-a0d6-ea75f375e8e7", 00:28:09.556 "is_configured": true, 00:28:09.556 "data_offset": 256, 00:28:09.556 "data_size": 7936 00:28:09.556 }, 00:28:09.556 { 00:28:09.556 "name": "BaseBdev2", 00:28:09.556 "uuid": "baf7bb4e-30b9-484f-925c-7ddfae5cc12a", 00:28:09.556 "is_configured": true, 00:28:09.556 "data_offset": 256, 00:28:09.556 "data_size": 7936 00:28:09.556 } 00:28:09.556 ] 00:28:09.556 } 00:28:09.556 } 00:28:09.556 }' 00:28:09.556 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:09.556 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:09.556 BaseBdev2' 00:28:09.556 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:09.556 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:09.556 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:09.814 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:09.814 "name": "BaseBdev1", 00:28:09.814 "aliases": [ 00:28:09.814 "62fe4751-c1e5-410f-a0d6-ea75f375e8e7" 00:28:09.814 ], 00:28:09.814 "product_name": "Malloc disk", 00:28:09.814 "block_size": 4096, 00:28:09.814 "num_blocks": 8192, 00:28:09.814 "uuid": "62fe4751-c1e5-410f-a0d6-ea75f375e8e7", 00:28:09.814 "md_size": 32, 00:28:09.814 "md_interleave": false, 00:28:09.814 "dif_type": 0, 00:28:09.814 "assigned_rate_limits": { 00:28:09.814 "rw_ios_per_sec": 0, 00:28:09.814 
"rw_mbytes_per_sec": 0, 00:28:09.814 "r_mbytes_per_sec": 0, 00:28:09.814 "w_mbytes_per_sec": 0 00:28:09.814 }, 00:28:09.814 "claimed": true, 00:28:09.814 "claim_type": "exclusive_write", 00:28:09.814 "zoned": false, 00:28:09.814 "supported_io_types": { 00:28:09.814 "read": true, 00:28:09.814 "write": true, 00:28:09.814 "unmap": true, 00:28:09.814 "flush": true, 00:28:09.814 "reset": true, 00:28:09.814 "nvme_admin": false, 00:28:09.814 "nvme_io": false, 00:28:09.814 "nvme_io_md": false, 00:28:09.814 "write_zeroes": true, 00:28:09.814 "zcopy": true, 00:28:09.814 "get_zone_info": false, 00:28:09.814 "zone_management": false, 00:28:09.814 "zone_append": false, 00:28:09.814 "compare": false, 00:28:09.814 "compare_and_write": false, 00:28:09.814 "abort": true, 00:28:09.814 "seek_hole": false, 00:28:09.814 "seek_data": false, 00:28:09.814 "copy": true, 00:28:09.814 "nvme_iov_md": false 00:28:09.814 }, 00:28:09.814 "memory_domains": [ 00:28:09.814 { 00:28:09.814 "dma_device_id": "system", 00:28:09.814 "dma_device_type": 1 00:28:09.814 }, 00:28:09.814 { 00:28:09.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:09.814 "dma_device_type": 2 00:28:09.814 } 00:28:09.814 ], 00:28:09.814 "driver_specific": {} 00:28:09.814 }' 00:28:09.814 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:09.814 10:35:46 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:10.073 10:35:47 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:10.073 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:10.331 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:10.331 "name": "BaseBdev2", 00:28:10.331 "aliases": [ 00:28:10.331 "baf7bb4e-30b9-484f-925c-7ddfae5cc12a" 00:28:10.331 ], 00:28:10.331 "product_name": "Malloc disk", 00:28:10.331 "block_size": 4096, 00:28:10.331 "num_blocks": 8192, 00:28:10.331 "uuid": "baf7bb4e-30b9-484f-925c-7ddfae5cc12a", 00:28:10.331 "md_size": 32, 00:28:10.331 "md_interleave": false, 00:28:10.331 "dif_type": 0, 00:28:10.331 "assigned_rate_limits": { 00:28:10.331 "rw_ios_per_sec": 0, 00:28:10.331 "rw_mbytes_per_sec": 0, 00:28:10.331 "r_mbytes_per_sec": 0, 00:28:10.331 "w_mbytes_per_sec": 0 00:28:10.331 }, 00:28:10.331 "claimed": true, 00:28:10.331 "claim_type": "exclusive_write", 00:28:10.331 "zoned": false, 00:28:10.331 "supported_io_types": { 
00:28:10.331 "read": true, 00:28:10.331 "write": true, 00:28:10.331 "unmap": true, 00:28:10.331 "flush": true, 00:28:10.331 "reset": true, 00:28:10.331 "nvme_admin": false, 00:28:10.331 "nvme_io": false, 00:28:10.331 "nvme_io_md": false, 00:28:10.331 "write_zeroes": true, 00:28:10.331 "zcopy": true, 00:28:10.331 "get_zone_info": false, 00:28:10.331 "zone_management": false, 00:28:10.331 "zone_append": false, 00:28:10.331 "compare": false, 00:28:10.331 "compare_and_write": false, 00:28:10.331 "abort": true, 00:28:10.331 "seek_hole": false, 00:28:10.331 "seek_data": false, 00:28:10.331 "copy": true, 00:28:10.331 "nvme_iov_md": false 00:28:10.331 }, 00:28:10.331 "memory_domains": [ 00:28:10.331 { 00:28:10.331 "dma_device_id": "system", 00:28:10.331 "dma_device_type": 1 00:28:10.331 }, 00:28:10.331 { 00:28:10.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:10.331 "dma_device_type": 2 00:28:10.331 } 00:28:10.331 ], 00:28:10.331 "driver_specific": {} 00:28:10.331 }' 00:28:10.331 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.588 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:10.588 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:10.588 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.588 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:10.588 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:10.588 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.588 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:10.845 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:10.845 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.845 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:10.845 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:10.845 10:35:47 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:11.105 [2024-07-15 10:35:48.116721] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:11.105 
10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.105 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:11.362 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.363 "name": "Existed_Raid", 00:28:11.363 "uuid": "e7340923-9a4b-4440-8f47-7a982938f039", 00:28:11.363 "strip_size_kb": 0, 00:28:11.363 "state": "online", 00:28:11.363 "raid_level": "raid1", 00:28:11.363 "superblock": true, 00:28:11.363 "num_base_bdevs": 2, 00:28:11.363 "num_base_bdevs_discovered": 1, 00:28:11.363 "num_base_bdevs_operational": 1, 00:28:11.363 "base_bdevs_list": [ 00:28:11.363 { 00:28:11.363 "name": null, 00:28:11.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:11.363 "is_configured": false, 00:28:11.363 "data_offset": 256, 00:28:11.363 "data_size": 7936 00:28:11.363 }, 00:28:11.363 { 00:28:11.363 "name": "BaseBdev2", 00:28:11.363 "uuid": "baf7bb4e-30b9-484f-925c-7ddfae5cc12a", 00:28:11.363 "is_configured": true, 00:28:11.363 "data_offset": 256, 00:28:11.363 "data_size": 7936 00:28:11.363 } 00:28:11.363 ] 00:28:11.363 }' 00:28:11.363 10:35:48 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.363 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:11.927 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:11.927 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:11.927 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.927 10:35:48 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:12.184 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:12.185 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:12.185 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:12.442 [2024-07-15 10:35:49.446984] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:12.442 [2024-07-15 10:35:49.447071] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:12.442 [2024-07-15 10:35:49.458404] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:12.442 [2024-07-15 10:35:49.458439] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:12.442 [2024-07-15 10:35:49.458452] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d00210 name Existed_Raid, state offline 00:28:12.442 10:35:49 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:12.442 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:12.442 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.442 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 620769 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 620769 ']' 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 620769 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 620769 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:12.700 10:35:49 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 620769' 00:28:12.700 killing process with pid 620769 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 620769 00:28:12.700 [2024-07-15 10:35:49.762664] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:12.700 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 620769 00:28:12.700 [2024-07-15 10:35:49.763544] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:12.958 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:28:12.958 00:28:12.958 real 0m10.863s 00:28:12.958 user 0m19.375s 00:28:12.958 sys 0m2.043s 00:28:12.958 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:12.958 10:35:49 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:12.958 ************************************ 00:28:12.958 END TEST raid_state_function_test_sb_md_separate 00:28:12.958 ************************************ 00:28:12.958 10:35:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:12.958 10:35:50 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:28:12.958 10:35:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:12.958 10:35:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:12.958 10:35:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:12.958 ************************************ 00:28:12.958 START TEST raid_superblock_test_md_separate 00:28:12.958 ************************************ 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 
00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=622375 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 622375 
/var/tmp/spdk-raid.sock 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 622375 ']' 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:12.958 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:12.958 10:35:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:12.958 [2024-07-15 10:35:50.113456] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:28:12.958 [2024-07-15 10:35:50.113525] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid622375 ] 00:28:13.216 [2024-07-15 10:35:50.239907] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:13.216 [2024-07-15 10:35:50.345818] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:13.473 [2024-07-15 10:35:50.414867] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:13.473 [2024-07-15 10:35:50.414907] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:14.035 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:14.035 10:35:51 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:28:14.291 malloc1 00:28:14.291 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:14.548 [2024-07-15 10:35:51.494033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:14.548 [2024-07-15 10:35:51.494083] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:14.548 [2024-07-15 10:35:51.494105] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb90830 00:28:14.548 [2024-07-15 10:35:51.494118] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:14.548 [2024-07-15 10:35:51.495683] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:14.548 [2024-07-15 10:35:51.495714] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:14.548 pt1 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:14.548 10:35:51 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:14.548 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:28:14.548 malloc2 00:28:14.805 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:14.805 [2024-07-15 10:35:51.968768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:14.805 [2024-07-15 10:35:51.968820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:14.805 [2024-07-15 10:35:51.968841] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb82250 00:28:14.805 [2024-07-15 10:35:51.968854] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:14.805 [2024-07-15 10:35:51.970309] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:14.805 [2024-07-15 10:35:51.970339] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:14.805 pt2 00:28:14.805 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:14.805 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:14.805 10:35:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:15.062 [2024-07-15 10:35:52.137245] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:15.062 [2024-07-15 10:35:52.138683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:15.062 [2024-07-15 10:35:52.138836] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb82d20 00:28:15.062 [2024-07-15 10:35:52.138849] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:15.062 [2024-07-15 10:35:52.138934] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb76a60 00:28:15.062 [2024-07-15 10:35:52.139056] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb82d20 00:28:15.062 [2024-07-15 10:35:52.139067] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb82d20 00:28:15.062 [2024-07-15 10:35:52.139146] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:15.062 10:35:52 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.062 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.318 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.318 "name": "raid_bdev1", 00:28:15.318 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:15.318 "strip_size_kb": 0, 00:28:15.318 "state": "online", 00:28:15.318 "raid_level": "raid1", 00:28:15.318 "superblock": true, 00:28:15.318 "num_base_bdevs": 2, 00:28:15.318 "num_base_bdevs_discovered": 2, 00:28:15.318 "num_base_bdevs_operational": 2, 00:28:15.318 "base_bdevs_list": [ 00:28:15.318 { 00:28:15.318 "name": "pt1", 00:28:15.318 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:15.318 "is_configured": true, 00:28:15.318 "data_offset": 256, 00:28:15.318 "data_size": 7936 00:28:15.318 }, 00:28:15.318 { 00:28:15.318 "name": "pt2", 00:28:15.318 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:15.318 "is_configured": true, 00:28:15.318 "data_offset": 256, 00:28:15.318 "data_size": 7936 00:28:15.318 } 00:28:15.318 ] 00:28:15.318 }' 00:28:15.318 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.318 10:35:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:15.883 10:35:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:16.141 [2024-07-15 10:35:53.188268] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:16.141 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:16.141 "name": "raid_bdev1", 00:28:16.141 "aliases": [ 00:28:16.141 "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e" 00:28:16.141 ], 00:28:16.141 "product_name": "Raid Volume", 00:28:16.141 "block_size": 4096, 00:28:16.141 "num_blocks": 7936, 00:28:16.141 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:16.141 "md_size": 32, 00:28:16.141 "md_interleave": false, 00:28:16.141 "dif_type": 0, 00:28:16.141 "assigned_rate_limits": { 00:28:16.141 "rw_ios_per_sec": 0, 00:28:16.141 "rw_mbytes_per_sec": 0, 00:28:16.141 "r_mbytes_per_sec": 0, 00:28:16.141 "w_mbytes_per_sec": 0 00:28:16.141 }, 00:28:16.141 "claimed": false, 00:28:16.141 "zoned": false, 00:28:16.141 "supported_io_types": { 00:28:16.141 "read": true, 00:28:16.141 "write": true, 00:28:16.141 "unmap": false, 00:28:16.141 "flush": false, 00:28:16.141 "reset": true, 00:28:16.141 "nvme_admin": false, 00:28:16.141 "nvme_io": false, 00:28:16.141 "nvme_io_md": false, 00:28:16.141 "write_zeroes": true, 
00:28:16.141 "zcopy": false, 00:28:16.141 "get_zone_info": false, 00:28:16.141 "zone_management": false, 00:28:16.141 "zone_append": false, 00:28:16.141 "compare": false, 00:28:16.141 "compare_and_write": false, 00:28:16.141 "abort": false, 00:28:16.141 "seek_hole": false, 00:28:16.141 "seek_data": false, 00:28:16.141 "copy": false, 00:28:16.141 "nvme_iov_md": false 00:28:16.141 }, 00:28:16.141 "memory_domains": [ 00:28:16.141 { 00:28:16.141 "dma_device_id": "system", 00:28:16.141 "dma_device_type": 1 00:28:16.141 }, 00:28:16.141 { 00:28:16.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:16.141 "dma_device_type": 2 00:28:16.141 }, 00:28:16.141 { 00:28:16.141 "dma_device_id": "system", 00:28:16.141 "dma_device_type": 1 00:28:16.141 }, 00:28:16.141 { 00:28:16.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:16.141 "dma_device_type": 2 00:28:16.141 } 00:28:16.141 ], 00:28:16.141 "driver_specific": { 00:28:16.141 "raid": { 00:28:16.141 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:16.141 "strip_size_kb": 0, 00:28:16.141 "state": "online", 00:28:16.141 "raid_level": "raid1", 00:28:16.141 "superblock": true, 00:28:16.141 "num_base_bdevs": 2, 00:28:16.141 "num_base_bdevs_discovered": 2, 00:28:16.141 "num_base_bdevs_operational": 2, 00:28:16.141 "base_bdevs_list": [ 00:28:16.141 { 00:28:16.141 "name": "pt1", 00:28:16.141 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:16.141 "is_configured": true, 00:28:16.141 "data_offset": 256, 00:28:16.141 "data_size": 7936 00:28:16.141 }, 00:28:16.141 { 00:28:16.141 "name": "pt2", 00:28:16.141 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:16.141 "is_configured": true, 00:28:16.141 "data_offset": 256, 00:28:16.141 "data_size": 7936 00:28:16.141 } 00:28:16.141 ] 00:28:16.141 } 00:28:16.141 } 00:28:16.141 }' 00:28:16.141 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:16.141 10:35:53 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:16.141 pt2' 00:28:16.141 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:16.141 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:16.141 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:16.399 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:16.399 "name": "pt1", 00:28:16.399 "aliases": [ 00:28:16.399 "00000000-0000-0000-0000-000000000001" 00:28:16.399 ], 00:28:16.399 "product_name": "passthru", 00:28:16.399 "block_size": 4096, 00:28:16.399 "num_blocks": 8192, 00:28:16.399 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:16.399 "md_size": 32, 00:28:16.399 "md_interleave": false, 00:28:16.399 "dif_type": 0, 00:28:16.399 "assigned_rate_limits": { 00:28:16.399 "rw_ios_per_sec": 0, 00:28:16.399 "rw_mbytes_per_sec": 0, 00:28:16.399 "r_mbytes_per_sec": 0, 00:28:16.399 "w_mbytes_per_sec": 0 00:28:16.399 }, 00:28:16.399 "claimed": true, 00:28:16.399 "claim_type": "exclusive_write", 00:28:16.399 "zoned": false, 00:28:16.399 "supported_io_types": { 00:28:16.399 "read": true, 00:28:16.399 "write": true, 00:28:16.399 "unmap": true, 00:28:16.399 "flush": true, 00:28:16.399 "reset": true, 00:28:16.399 "nvme_admin": false, 00:28:16.399 "nvme_io": false, 00:28:16.399 "nvme_io_md": false, 00:28:16.399 "write_zeroes": true, 00:28:16.399 "zcopy": true, 00:28:16.399 "get_zone_info": false, 00:28:16.399 "zone_management": false, 00:28:16.399 "zone_append": false, 00:28:16.399 "compare": false, 00:28:16.399 "compare_and_write": false, 00:28:16.399 "abort": true, 00:28:16.399 "seek_hole": false, 00:28:16.399 "seek_data": false, 00:28:16.399 "copy": true, 00:28:16.399 
"nvme_iov_md": false 00:28:16.399 }, 00:28:16.399 "memory_domains": [ 00:28:16.399 { 00:28:16.399 "dma_device_id": "system", 00:28:16.399 "dma_device_type": 1 00:28:16.399 }, 00:28:16.399 { 00:28:16.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:16.399 "dma_device_type": 2 00:28:16.399 } 00:28:16.399 ], 00:28:16.399 "driver_specific": { 00:28:16.399 "passthru": { 00:28:16.399 "name": "pt1", 00:28:16.399 "base_bdev_name": "malloc1" 00:28:16.399 } 00:28:16.399 } 00:28:16.399 }' 00:28:16.399 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:16.399 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:16.399 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:16.399 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:16.657 10:35:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:16.913 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:16.913 "name": "pt2", 00:28:16.913 "aliases": [ 00:28:16.913 "00000000-0000-0000-0000-000000000002" 00:28:16.913 ], 00:28:16.913 "product_name": "passthru", 00:28:16.913 "block_size": 4096, 00:28:16.913 "num_blocks": 8192, 00:28:16.913 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:16.913 "md_size": 32, 00:28:16.913 "md_interleave": false, 00:28:16.913 "dif_type": 0, 00:28:16.913 "assigned_rate_limits": { 00:28:16.913 "rw_ios_per_sec": 0, 00:28:16.913 "rw_mbytes_per_sec": 0, 00:28:16.913 "r_mbytes_per_sec": 0, 00:28:16.913 "w_mbytes_per_sec": 0 00:28:16.913 }, 00:28:16.913 "claimed": true, 00:28:16.913 "claim_type": "exclusive_write", 00:28:16.913 "zoned": false, 00:28:16.913 "supported_io_types": { 00:28:16.913 "read": true, 00:28:16.913 "write": true, 00:28:16.913 "unmap": true, 00:28:16.914 "flush": true, 00:28:16.914 "reset": true, 00:28:16.914 "nvme_admin": false, 00:28:16.914 "nvme_io": false, 00:28:16.914 "nvme_io_md": false, 00:28:16.914 "write_zeroes": true, 00:28:16.914 "zcopy": true, 00:28:16.914 "get_zone_info": false, 00:28:16.914 "zone_management": false, 00:28:16.914 "zone_append": false, 00:28:16.914 "compare": false, 00:28:16.914 "compare_and_write": false, 00:28:16.914 "abort": true, 00:28:16.914 "seek_hole": false, 00:28:16.914 "seek_data": false, 00:28:16.914 "copy": true, 00:28:16.914 "nvme_iov_md": false 00:28:16.914 }, 00:28:16.914 "memory_domains": [ 00:28:16.914 { 00:28:16.914 "dma_device_id": "system", 00:28:16.914 "dma_device_type": 1 00:28:16.914 }, 00:28:16.914 { 00:28:16.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:16.914 "dma_device_type": 2 00:28:16.914 } 
00:28:16.914 ], 00:28:16.914 "driver_specific": { 00:28:16.914 "passthru": { 00:28:16.914 "name": "pt2", 00:28:16.914 "base_bdev_name": "malloc2" 00:28:16.914 } 00:28:16.914 } 00:28:16.914 }' 00:28:16.914 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:17.170 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:17.426 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:17.426 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:17.426 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:17.426 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:17.684 [2024-07-15 10:35:54.668190] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:17.684 10:35:54 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ca89f63d-1c0c-4e90-ba2e-8f2f347f490e 00:28:17.684 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z ca89f63d-1c0c-4e90-ba2e-8f2f347f490e ']' 00:28:17.684 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:17.941 [2024-07-15 10:35:54.916582] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:17.941 [2024-07-15 10:35:54.916608] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:17.941 [2024-07-15 10:35:54.916668] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:17.941 [2024-07-15 10:35:54.916724] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:17.941 [2024-07-15 10:35:54.916736] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb82d20 name raid_bdev1, state offline 00:28:17.941 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.941 10:35:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:18.199 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:18.199 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:18.199 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:18.199 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:28:18.456 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:18.456 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:18.714 10:35:55 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:18.714 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:18.972 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:18.972 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:18.972 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:18.972 10:35:55 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:18.972 [2024-07-15 10:35:56.071603] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:18.972 [2024-07-15 10:35:56.072948] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:18.972 [2024-07-15 10:35:56.073004] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:18.972 [2024-07-15 10:35:56.073045] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:18.972 [2024-07-15 10:35:56.073064] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:18.972 [2024-07-15 10:35:56.073074] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9f2ed0 name raid_bdev1, state configuring 00:28:18.972 request: 00:28:18.972 { 00:28:18.972 "name": "raid_bdev1", 00:28:18.972 "raid_level": "raid1", 00:28:18.972 "base_bdevs": [ 
00:28:18.972 "malloc1", 00:28:18.972 "malloc2" 00:28:18.972 ], 00:28:18.972 "superblock": false, 00:28:18.972 "method": "bdev_raid_create", 00:28:18.972 "req_id": 1 00:28:18.972 } 00:28:18.972 Got JSON-RPC error response 00:28:18.972 response: 00:28:18.972 { 00:28:18.972 "code": -17, 00:28:18.972 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:18.972 } 00:28:18.972 10:35:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:18.972 10:35:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:18.972 10:35:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:18.972 10:35:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:18.972 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.972 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:19.230 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:19.230 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:19.230 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:19.487 [2024-07-15 10:35:56.576875] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:19.487 [2024-07-15 10:35:56.576934] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:19.487 [2024-07-15 10:35:56.576955] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb90ee0 
00:28:19.487 [2024-07-15 10:35:56.576967] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:19.487 [2024-07-15 10:35:56.578465] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:19.487 [2024-07-15 10:35:56.578493] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:19.487 [2024-07-15 10:35:56.578543] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:19.487 [2024-07-15 10:35:56.578568] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:19.487 pt1 00:28:19.487 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.488 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.746 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:19.746 "name": "raid_bdev1", 00:28:19.746 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:19.746 "strip_size_kb": 0, 00:28:19.746 "state": "configuring", 00:28:19.746 "raid_level": "raid1", 00:28:19.746 "superblock": true, 00:28:19.746 "num_base_bdevs": 2, 00:28:19.746 "num_base_bdevs_discovered": 1, 00:28:19.746 "num_base_bdevs_operational": 2, 00:28:19.746 "base_bdevs_list": [ 00:28:19.746 { 00:28:19.746 "name": "pt1", 00:28:19.746 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:19.746 "is_configured": true, 00:28:19.746 "data_offset": 256, 00:28:19.746 "data_size": 7936 00:28:19.746 }, 00:28:19.746 { 00:28:19.746 "name": null, 00:28:19.746 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:19.746 "is_configured": false, 00:28:19.746 "data_offset": 256, 00:28:19.746 "data_size": 7936 00:28:19.746 } 00:28:19.746 ] 00:28:19.746 }' 00:28:19.746 10:35:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:19.746 10:35:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:20.311 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:20.311 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:20.311 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:20.311 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:20.570 [2024-07-15 10:35:57.671778] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:20.570 [2024-07-15 10:35:57.671833] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:20.570 [2024-07-15 10:35:57.671852] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f3490 00:28:20.570 [2024-07-15 10:35:57.671865] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:20.570 [2024-07-15 10:35:57.672073] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:20.570 [2024-07-15 10:35:57.672093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:20.570 [2024-07-15 10:35:57.672141] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:20.570 [2024-07-15 10:35:57.672160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:20.570 [2024-07-15 10:35:57.672253] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb775d0 00:28:20.570 [2024-07-15 10:35:57.672265] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:20.570 [2024-07-15 10:35:57.672321] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb78800 00:28:20.570 [2024-07-15 10:35:57.672422] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb775d0 00:28:20.570 [2024-07-15 10:35:57.672432] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb775d0 00:28:20.570 [2024-07-15 10:35:57.672501] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:20.570 pt2 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.570 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.828 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.828 "name": "raid_bdev1", 00:28:20.828 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:20.828 "strip_size_kb": 0, 00:28:20.828 "state": "online", 00:28:20.828 "raid_level": "raid1", 00:28:20.828 "superblock": true, 00:28:20.828 "num_base_bdevs": 2, 00:28:20.828 
"num_base_bdevs_discovered": 2, 00:28:20.828 "num_base_bdevs_operational": 2, 00:28:20.828 "base_bdevs_list": [ 00:28:20.828 { 00:28:20.828 "name": "pt1", 00:28:20.828 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:20.828 "is_configured": true, 00:28:20.828 "data_offset": 256, 00:28:20.828 "data_size": 7936 00:28:20.828 }, 00:28:20.828 { 00:28:20.828 "name": "pt2", 00:28:20.828 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:20.828 "is_configured": true, 00:28:20.828 "data_offset": 256, 00:28:20.828 "data_size": 7936 00:28:20.828 } 00:28:20.828 ] 00:28:20.828 }' 00:28:20.828 10:35:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.828 10:35:57 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:21.394 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:21.653 [2024-07-15 10:35:58.754898] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:21.653 10:35:58 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:21.653 "name": "raid_bdev1", 00:28:21.653 "aliases": [ 00:28:21.653 "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e" 00:28:21.653 ], 00:28:21.653 "product_name": "Raid Volume", 00:28:21.653 "block_size": 4096, 00:28:21.653 "num_blocks": 7936, 00:28:21.653 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:21.653 "md_size": 32, 00:28:21.653 "md_interleave": false, 00:28:21.653 "dif_type": 0, 00:28:21.653 "assigned_rate_limits": { 00:28:21.653 "rw_ios_per_sec": 0, 00:28:21.653 "rw_mbytes_per_sec": 0, 00:28:21.653 "r_mbytes_per_sec": 0, 00:28:21.653 "w_mbytes_per_sec": 0 00:28:21.653 }, 00:28:21.653 "claimed": false, 00:28:21.653 "zoned": false, 00:28:21.653 "supported_io_types": { 00:28:21.653 "read": true, 00:28:21.653 "write": true, 00:28:21.653 "unmap": false, 00:28:21.653 "flush": false, 00:28:21.653 "reset": true, 00:28:21.653 "nvme_admin": false, 00:28:21.653 "nvme_io": false, 00:28:21.653 "nvme_io_md": false, 00:28:21.653 "write_zeroes": true, 00:28:21.653 "zcopy": false, 00:28:21.653 "get_zone_info": false, 00:28:21.653 "zone_management": false, 00:28:21.653 "zone_append": false, 00:28:21.653 "compare": false, 00:28:21.653 "compare_and_write": false, 00:28:21.653 "abort": false, 00:28:21.653 "seek_hole": false, 00:28:21.653 "seek_data": false, 00:28:21.653 "copy": false, 00:28:21.653 "nvme_iov_md": false 00:28:21.653 }, 00:28:21.653 "memory_domains": [ 00:28:21.653 { 00:28:21.653 "dma_device_id": "system", 00:28:21.653 "dma_device_type": 1 00:28:21.653 }, 00:28:21.653 { 00:28:21.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:21.653 "dma_device_type": 2 00:28:21.653 }, 00:28:21.653 { 00:28:21.653 "dma_device_id": "system", 00:28:21.653 "dma_device_type": 1 00:28:21.653 }, 00:28:21.653 { 00:28:21.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:21.653 "dma_device_type": 2 00:28:21.653 } 00:28:21.653 ], 00:28:21.653 "driver_specific": { 00:28:21.653 "raid": { 
00:28:21.653 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:21.653 "strip_size_kb": 0, 00:28:21.653 "state": "online", 00:28:21.653 "raid_level": "raid1", 00:28:21.653 "superblock": true, 00:28:21.653 "num_base_bdevs": 2, 00:28:21.653 "num_base_bdevs_discovered": 2, 00:28:21.653 "num_base_bdevs_operational": 2, 00:28:21.653 "base_bdevs_list": [ 00:28:21.653 { 00:28:21.653 "name": "pt1", 00:28:21.653 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:21.653 "is_configured": true, 00:28:21.653 "data_offset": 256, 00:28:21.653 "data_size": 7936 00:28:21.653 }, 00:28:21.653 { 00:28:21.653 "name": "pt2", 00:28:21.653 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:21.653 "is_configured": true, 00:28:21.653 "data_offset": 256, 00:28:21.653 "data_size": 7936 00:28:21.653 } 00:28:21.653 ] 00:28:21.653 } 00:28:21.653 } 00:28:21.653 }' 00:28:21.653 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:21.653 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:21.653 pt2' 00:28:21.653 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:21.653 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:21.653 10:35:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:21.911 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:21.911 "name": "pt1", 00:28:21.911 "aliases": [ 00:28:21.911 "00000000-0000-0000-0000-000000000001" 00:28:21.911 ], 00:28:21.911 "product_name": "passthru", 00:28:21.911 "block_size": 4096, 00:28:21.911 "num_blocks": 8192, 00:28:21.911 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:28:21.911 "md_size": 32, 00:28:21.911 "md_interleave": false, 00:28:21.911 "dif_type": 0, 00:28:21.911 "assigned_rate_limits": { 00:28:21.911 "rw_ios_per_sec": 0, 00:28:21.911 "rw_mbytes_per_sec": 0, 00:28:21.911 "r_mbytes_per_sec": 0, 00:28:21.911 "w_mbytes_per_sec": 0 00:28:21.911 }, 00:28:21.911 "claimed": true, 00:28:21.911 "claim_type": "exclusive_write", 00:28:21.911 "zoned": false, 00:28:21.911 "supported_io_types": { 00:28:21.911 "read": true, 00:28:21.911 "write": true, 00:28:21.911 "unmap": true, 00:28:21.911 "flush": true, 00:28:21.911 "reset": true, 00:28:21.911 "nvme_admin": false, 00:28:21.911 "nvme_io": false, 00:28:21.911 "nvme_io_md": false, 00:28:21.911 "write_zeroes": true, 00:28:21.911 "zcopy": true, 00:28:21.911 "get_zone_info": false, 00:28:21.911 "zone_management": false, 00:28:21.911 "zone_append": false, 00:28:21.911 "compare": false, 00:28:21.911 "compare_and_write": false, 00:28:21.911 "abort": true, 00:28:21.911 "seek_hole": false, 00:28:21.911 "seek_data": false, 00:28:21.911 "copy": true, 00:28:21.911 "nvme_iov_md": false 00:28:21.911 }, 00:28:21.911 "memory_domains": [ 00:28:21.911 { 00:28:21.911 "dma_device_id": "system", 00:28:21.911 "dma_device_type": 1 00:28:21.911 }, 00:28:21.911 { 00:28:21.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:21.912 "dma_device_type": 2 00:28:21.912 } 00:28:21.912 ], 00:28:21.912 "driver_specific": { 00:28:21.912 "passthru": { 00:28:21.912 "name": "pt1", 00:28:21.912 "base_bdev_name": "malloc1" 00:28:21.912 } 00:28:21.912 } 00:28:21.912 }' 00:28:21.912 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:21.912 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:22.169 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.426 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.426 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:22.426 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:22.426 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:22.426 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:22.684 "name": "pt2", 00:28:22.684 "aliases": [ 00:28:22.684 "00000000-0000-0000-0000-000000000002" 00:28:22.684 ], 00:28:22.684 "product_name": "passthru", 00:28:22.684 "block_size": 4096, 00:28:22.684 "num_blocks": 8192, 00:28:22.684 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:22.684 "md_size": 32, 00:28:22.684 "md_interleave": false, 00:28:22.684 "dif_type": 0, 00:28:22.684 "assigned_rate_limits": { 00:28:22.684 "rw_ios_per_sec": 0, 00:28:22.684 "rw_mbytes_per_sec": 0, 00:28:22.684 "r_mbytes_per_sec": 0, 00:28:22.684 
"w_mbytes_per_sec": 0 00:28:22.684 }, 00:28:22.684 "claimed": true, 00:28:22.684 "claim_type": "exclusive_write", 00:28:22.684 "zoned": false, 00:28:22.684 "supported_io_types": { 00:28:22.684 "read": true, 00:28:22.684 "write": true, 00:28:22.684 "unmap": true, 00:28:22.684 "flush": true, 00:28:22.684 "reset": true, 00:28:22.684 "nvme_admin": false, 00:28:22.684 "nvme_io": false, 00:28:22.684 "nvme_io_md": false, 00:28:22.684 "write_zeroes": true, 00:28:22.684 "zcopy": true, 00:28:22.684 "get_zone_info": false, 00:28:22.684 "zone_management": false, 00:28:22.684 "zone_append": false, 00:28:22.684 "compare": false, 00:28:22.684 "compare_and_write": false, 00:28:22.684 "abort": true, 00:28:22.684 "seek_hole": false, 00:28:22.684 "seek_data": false, 00:28:22.684 "copy": true, 00:28:22.684 "nvme_iov_md": false 00:28:22.684 }, 00:28:22.684 "memory_domains": [ 00:28:22.684 { 00:28:22.684 "dma_device_id": "system", 00:28:22.684 "dma_device_type": 1 00:28:22.684 }, 00:28:22.684 { 00:28:22.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.684 "dma_device_type": 2 00:28:22.684 } 00:28:22.684 ], 00:28:22.684 "driver_specific": { 00:28:22.684 "passthru": { 00:28:22.684 "name": "pt2", 00:28:22.684 "base_bdev_name": "malloc2" 00:28:22.684 } 00:28:22.684 } 00:28:22.684 }' 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:22.684 10:35:59 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.684 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:22.942 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:28:22.942 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.942 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:22.942 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:22.942 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:22.942 10:35:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:23.200 [2024-07-15 10:36:00.198752] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:23.200 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' ca89f63d-1c0c-4e90-ba2e-8f2f347f490e '!=' ca89f63d-1c0c-4e90-ba2e-8f2f347f490e ']' 00:28:23.200 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:23.200 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:23.200 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:28:23.200 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:23.458 [2024-07-15 10:36:00.443162] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.458 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.716 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.716 "name": "raid_bdev1", 00:28:23.716 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:23.716 "strip_size_kb": 0, 00:28:23.716 "state": "online", 00:28:23.716 "raid_level": "raid1", 00:28:23.716 "superblock": true, 00:28:23.716 "num_base_bdevs": 2, 00:28:23.716 "num_base_bdevs_discovered": 1, 00:28:23.716 "num_base_bdevs_operational": 1, 00:28:23.716 
"base_bdevs_list": [ 00:28:23.716 { 00:28:23.716 "name": null, 00:28:23.716 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.716 "is_configured": false, 00:28:23.716 "data_offset": 256, 00:28:23.716 "data_size": 7936 00:28:23.716 }, 00:28:23.716 { 00:28:23.716 "name": "pt2", 00:28:23.716 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:23.716 "is_configured": true, 00:28:23.716 "data_offset": 256, 00:28:23.716 "data_size": 7936 00:28:23.716 } 00:28:23.716 ] 00:28:23.716 }' 00:28:23.716 10:36:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.716 10:36:00 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:24.282 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:24.539 [2024-07-15 10:36:01.542055] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:24.539 [2024-07-15 10:36:01.542083] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:24.539 [2024-07-15 10:36:01.542142] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:24.539 [2024-07-15 10:36:01.542190] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:24.539 [2024-07-15 10:36:01.542202] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb775d0 name raid_bdev1, state offline 00:28:24.539 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.539 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:24.797 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- 
# raid_bdev= 00:28:24.797 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:24.797 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:24.797 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:24.797 10:36:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:25.053 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:25.053 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:25.053 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:25.053 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:25.053 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:28:25.053 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:25.309 [2024-07-15 10:36:02.267957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:25.309 [2024-07-15 10:36:02.268008] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:25.309 [2024-07-15 10:36:02.268028] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb75660 00:28:25.309 [2024-07-15 10:36:02.268041] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:25.309 [2024-07-15 10:36:02.269572] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:25.309 [2024-07-15 
10:36:02.269603] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:25.309 [2024-07-15 10:36:02.269653] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:25.309 [2024-07-15 10:36:02.269679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:25.309 [2024-07-15 10:36:02.269762] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb77d10 00:28:25.309 [2024-07-15 10:36:02.269773] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:25.309 [2024-07-15 10:36:02.269832] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb78560 00:28:25.309 [2024-07-15 10:36:02.269943] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb77d10 00:28:25.309 [2024-07-15 10:36:02.269954] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb77d10 00:28:25.309 [2024-07-15 10:36:02.270026] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:25.309 pt2 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.309 
10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.309 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.567 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.567 "name": "raid_bdev1", 00:28:25.567 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:25.567 "strip_size_kb": 0, 00:28:25.567 "state": "online", 00:28:25.567 "raid_level": "raid1", 00:28:25.567 "superblock": true, 00:28:25.567 "num_base_bdevs": 2, 00:28:25.567 "num_base_bdevs_discovered": 1, 00:28:25.567 "num_base_bdevs_operational": 1, 00:28:25.567 "base_bdevs_list": [ 00:28:25.567 { 00:28:25.567 "name": null, 00:28:25.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.567 "is_configured": false, 00:28:25.567 "data_offset": 256, 00:28:25.567 "data_size": 7936 00:28:25.567 }, 00:28:25.567 { 00:28:25.567 "name": "pt2", 00:28:25.567 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:25.567 "is_configured": true, 00:28:25.567 "data_offset": 256, 00:28:25.567 "data_size": 7936 00:28:25.567 } 00:28:25.567 ] 00:28:25.567 }' 00:28:25.567 10:36:02 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.567 10:36:02 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:26.131 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:26.389 [2024-07-15 10:36:03.402948] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:26.389 [2024-07-15 10:36:03.402974] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:26.389 [2024-07-15 10:36:03.403028] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:26.389 [2024-07-15 10:36:03.403074] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:26.389 [2024-07-15 10:36:03.403086] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb77d10 name raid_bdev1, state offline 00:28:26.389 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.389 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:26.646 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:26.646 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:26.646 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:26.646 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:26.903 [2024-07-15 10:36:03.892219] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:26.903 [2024-07-15 10:36:03.892267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:26.903 [2024-07-15 10:36:03.892285] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb76760 00:28:26.903 [2024-07-15 10:36:03.892298] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:26.903 [2024-07-15 10:36:03.893765] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:26.903 [2024-07-15 10:36:03.893796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:26.903 [2024-07-15 10:36:03.893845] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:26.903 [2024-07-15 10:36:03.893869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:26.903 [2024-07-15 10:36:03.893973] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:26.903 [2024-07-15 10:36:03.893987] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:26.903 [2024-07-15 10:36:03.894001] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb78850 name raid_bdev1, state configuring 00:28:26.903 [2024-07-15 10:36:03.894025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:26.903 [2024-07-15 10:36:03.894079] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb77850 00:28:26.903 [2024-07-15 10:36:03.894096] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:26.903 [2024-07-15 10:36:03.894153] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb783b0 00:28:26.903 [2024-07-15 10:36:03.894251] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb77850 00:28:26.903 [2024-07-15 10:36:03.894260] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb77850 00:28:26.903 [2024-07-15 10:36:03.894333] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:26.903 
pt1 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:26.903 10:36:03 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.160 10:36:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:27.160 "name": "raid_bdev1", 00:28:27.160 "uuid": "ca89f63d-1c0c-4e90-ba2e-8f2f347f490e", 00:28:27.160 "strip_size_kb": 0, 00:28:27.160 "state": "online", 00:28:27.160 "raid_level": "raid1", 
00:28:27.160 "superblock": true, 00:28:27.160 "num_base_bdevs": 2, 00:28:27.160 "num_base_bdevs_discovered": 1, 00:28:27.160 "num_base_bdevs_operational": 1, 00:28:27.160 "base_bdevs_list": [ 00:28:27.160 { 00:28:27.160 "name": null, 00:28:27.160 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:27.160 "is_configured": false, 00:28:27.160 "data_offset": 256, 00:28:27.160 "data_size": 7936 00:28:27.160 }, 00:28:27.160 { 00:28:27.160 "name": "pt2", 00:28:27.160 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:27.160 "is_configured": true, 00:28:27.160 "data_offset": 256, 00:28:27.160 "data_size": 7936 00:28:27.160 } 00:28:27.160 ] 00:28:27.160 }' 00:28:27.160 10:36:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:27.160 10:36:04 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:27.724 10:36:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:27.724 10:36:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:27.980 10:36:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:27.980 10:36:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:27.980 10:36:04 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:28.238 [2024-07-15 10:36:05.199922] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:28.238 10:36:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' ca89f63d-1c0c-4e90-ba2e-8f2f347f490e '!=' ca89f63d-1c0c-4e90-ba2e-8f2f347f490e ']' 00:28:28.238 
10:36:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 622375 00:28:28.238 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 622375 ']' 00:28:28.238 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 622375 00:28:28.238 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:28.239 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:28.239 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 622375 00:28:28.239 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:28.239 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:28.239 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 622375' 00:28:28.239 killing process with pid 622375 00:28:28.239 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 622375 00:28:28.239 [2024-07-15 10:36:05.289713] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:28.239 [2024-07-15 10:36:05.289769] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:28.239 [2024-07-15 10:36:05.289814] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:28.239 [2024-07-15 10:36:05.289827] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb77850 name raid_bdev1, state offline 00:28:28.239 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 622375 00:28:28.239 [2024-07-15 10:36:05.311911] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: 
raid_bdev_exit 00:28:28.496 10:36:05 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:28:28.497 00:28:28.497 real 0m15.469s 00:28:28.497 user 0m28.008s 00:28:28.497 sys 0m2.876s 00:28:28.497 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:28.497 10:36:05 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:28.497 ************************************ 00:28:28.497 END TEST raid_superblock_test_md_separate 00:28:28.497 ************************************ 00:28:28.497 10:36:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:28.497 10:36:05 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:28:28.497 10:36:05 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:28:28.497 10:36:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:28.497 10:36:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:28.497 10:36:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:28.497 ************************************ 00:28:28.497 START TEST raid_rebuild_test_sb_md_separate 00:28:28.497 ************************************ 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@572 -- # local verify=true 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 
00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=625040 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 625040 /var/tmp/spdk-raid.sock 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 625040 ']' 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:28.497 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:28.497 10:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:28.497 [2024-07-15 10:36:05.660787] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:28:28.497 [2024-07-15 10:36:05.660853] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid625040 ] 00:28:28.497 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:28.497 Zero copy mechanism will not be used. 00:28:28.755 [2024-07-15 10:36:05.779262] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:28.755 [2024-07-15 10:36:05.883573] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:29.011 [2024-07-15 10:36:05.955832] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:29.011 [2024-07-15 10:36:05.955868] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:29.575 10:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:29.575 10:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:28:29.575 10:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:29.575 10:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:28:29.832 BaseBdev1_malloc 00:28:29.832 10:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:30.130 [2024-07-15 10:36:07.066021] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:30.130 [2024-07-15 10:36:07.066069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:30.130 [2024-07-15 
10:36:07.066094] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c86d0 00:28:30.130 [2024-07-15 10:36:07.066107] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:30.130 [2024-07-15 10:36:07.067589] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:30.130 [2024-07-15 10:36:07.067618] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:30.130 BaseBdev1 00:28:30.130 10:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:30.130 10:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:28:30.130 BaseBdev2_malloc 00:28:30.442 10:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:30.442 [2024-07-15 10:36:07.564741] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:30.442 [2024-07-15 10:36:07.564787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:30.442 [2024-07-15 10:36:07.564808] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25201f0 00:28:30.442 [2024-07-15 10:36:07.564821] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:30.442 [2024-07-15 10:36:07.566205] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:30.442 [2024-07-15 10:36:07.566233] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:30.442 BaseBdev2 00:28:30.442 10:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:28:30.700 spare_malloc 00:28:30.700 10:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:30.957 spare_delay 00:28:30.957 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:31.237 [2024-07-15 10:36:08.309427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:31.237 [2024-07-15 10:36:08.309474] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:31.237 [2024-07-15 10:36:08.309498] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251c7a0 00:28:31.237 [2024-07-15 10:36:08.309510] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:31.237 [2024-07-15 10:36:08.310964] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:31.237 [2024-07-15 10:36:08.310994] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:31.237 spare 00:28:31.237 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:31.494 [2024-07-15 10:36:08.546076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:31.494 [2024-07-15 10:36:08.547420] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:31.494 [2024-07-15 10:36:08.547585] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x251d1c0 00:28:31.494 [2024-07-15 10:36:08.547603] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:31.494 [2024-07-15 10:36:08.547677] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x242e360 00:28:31.494 [2024-07-15 10:36:08.547792] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x251d1c0 00:28:31.494 [2024-07-15 10:36:08.547801] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x251d1c0 00:28:31.494 [2024-07-15 10:36:08.547872] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:31.494 10:36:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.494 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.752 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.752 "name": "raid_bdev1", 00:28:31.752 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:31.752 "strip_size_kb": 0, 00:28:31.752 "state": "online", 00:28:31.752 "raid_level": "raid1", 00:28:31.752 "superblock": true, 00:28:31.752 "num_base_bdevs": 2, 00:28:31.752 "num_base_bdevs_discovered": 2, 00:28:31.752 "num_base_bdevs_operational": 2, 00:28:31.752 "base_bdevs_list": [ 00:28:31.752 { 00:28:31.752 "name": "BaseBdev1", 00:28:31.752 "uuid": "ecaffd2f-e47b-5e62-afd0-0063e9897998", 00:28:31.752 "is_configured": true, 00:28:31.752 "data_offset": 256, 00:28:31.752 "data_size": 7936 00:28:31.752 }, 00:28:31.752 { 00:28:31.752 "name": "BaseBdev2", 00:28:31.752 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:31.752 "is_configured": true, 00:28:31.752 "data_offset": 256, 00:28:31.752 "data_size": 7936 00:28:31.752 } 00:28:31.752 ] 00:28:31.752 }' 00:28:31.752 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.752 10:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:32.315 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:32.315 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:32.572 [2024-07-15 10:36:09.641381] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:32.572 
10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:32.572 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:32.572 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:32.830 10:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:32.830 10:36:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:33.395 [2024-07-15 10:36:10.394993] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x242e360 00:28:33.395 /dev/nbd0 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:33.395 1+0 records in 00:28:33.395 1+0 records out 00:28:33.395 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235146 s, 17.4 MB/s 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:33.395 10:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:34.325 7936+0 records in 00:28:34.325 7936+0 records out 00:28:34.325 32505856 bytes (33 MB, 31 MiB) copied, 0.739105 s, 44.0 MB/s 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:28:34.325 10:36:11 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:34.325 [2024-07-15 10:36:11.407332] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:34.325 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:34.582 [2024-07-15 10:36:11.648011] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.582 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:34.839 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:34.839 "name": "raid_bdev1", 00:28:34.839 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:34.839 "strip_size_kb": 0, 00:28:34.839 "state": "online", 00:28:34.839 "raid_level": "raid1", 00:28:34.839 "superblock": true, 00:28:34.839 "num_base_bdevs": 2, 00:28:34.839 "num_base_bdevs_discovered": 1, 00:28:34.839 "num_base_bdevs_operational": 1, 00:28:34.839 "base_bdevs_list": [ 00:28:34.839 { 00:28:34.839 "name": null, 00:28:34.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:34.839 "is_configured": false, 00:28:34.839 "data_offset": 256, 00:28:34.839 "data_size": 7936 00:28:34.839 }, 00:28:34.839 { 00:28:34.839 "name": "BaseBdev2", 
00:28:34.839 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:34.839 "is_configured": true,
00:28:34.839 "data_offset": 256,
00:28:34.839 "data_size": 7936
00:28:34.839 }
00:28:34.839 ]
00:28:34.839 }'
00:28:34.839 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:34.839 10:36:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x
00:28:35.402 10:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:28:35.659 [2024-07-15 10:36:12.682760] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:28:35.659 [2024-07-15 10:36:12.685064] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c7350
00:28:35.659 [2024-07-15 10:36:12.687344] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:28:35.659 10:36:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1
00:28:36.587 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:36.587 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:36.587 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:36.587 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:36.587 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:36.587 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:36.587 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:36.842 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:36.842 "name": "raid_bdev1",
00:28:36.842 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:36.842 "strip_size_kb": 0,
00:28:36.842 "state": "online",
00:28:36.842 "raid_level": "raid1",
00:28:36.842 "superblock": true,
00:28:36.842 "num_base_bdevs": 2,
00:28:36.842 "num_base_bdevs_discovered": 2,
00:28:36.842 "num_base_bdevs_operational": 2,
00:28:36.842 "process": {
00:28:36.842 "type": "rebuild",
00:28:36.842 "target": "spare",
00:28:36.842 "progress": {
00:28:36.842 "blocks": 3072,
00:28:36.842 "percent": 38
00:28:36.842 }
00:28:36.842 },
00:28:36.842 "base_bdevs_list": [
00:28:36.842 {
00:28:36.842 "name": "spare",
00:28:36.842 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758",
00:28:36.842 "is_configured": true,
00:28:36.842 "data_offset": 256,
00:28:36.842 "data_size": 7936
00:28:36.842 },
00:28:36.842 {
00:28:36.842 "name": "BaseBdev2",
00:28:36.842 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:36.842 "is_configured": true,
00:28:36.842 "data_offset": 256,
00:28:36.842 "data_size": 7936
00:28:36.842 }
00:28:36.842 ]
00:28:36.842 }'
00:28:36.842 10:36:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:36.842 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:36.842 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:37.098 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:37.098 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:28:37.354 [2024-07-15 10:36:14.534056] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:28:37.610 [2024-07-15 10:36:14.602229] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:28:37.610 [2024-07-15 10:36:14.602277] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:28:37.610 [2024-07-15 10:36:14.602292] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:28:37.610 [2024-07-15 10:36:14.602301] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:37.610 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:37.867 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:37.867 "name": "raid_bdev1",
00:28:37.867 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:37.867 "strip_size_kb": 0,
00:28:37.867 "state": "online",
00:28:37.867 "raid_level": "raid1",
00:28:37.867 "superblock": true,
00:28:37.867 "num_base_bdevs": 2,
00:28:37.867 "num_base_bdevs_discovered": 1,
00:28:37.867 "num_base_bdevs_operational": 1,
00:28:37.867 "base_bdevs_list": [
00:28:37.867 {
00:28:37.867 "name": null,
00:28:37.867 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:37.867 "is_configured": false,
00:28:37.867 "data_offset": 256,
00:28:37.867 "data_size": 7936
00:28:37.867 },
00:28:37.867 {
00:28:37.867 "name": "BaseBdev2",
00:28:37.867 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:37.867 "is_configured": true,
00:28:37.867 "data_offset": 256,
00:28:37.867 "data_size": 7936
00:28:37.867 }
00:28:37.867 ]
00:28:37.867 }'
00:28:37.867 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:37.867 10:36:14 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x
00:28:38.429 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:28:38.429 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:38.429 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:28:38.429 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none
00:28:38.429 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:38.429 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:38.429 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:38.687 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:38.687 "name": "raid_bdev1",
00:28:38.687 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:38.687 "strip_size_kb": 0,
00:28:38.687 "state": "online",
00:28:38.687 "raid_level": "raid1",
00:28:38.687 "superblock": true,
00:28:38.687 "num_base_bdevs": 2,
00:28:38.687 "num_base_bdevs_discovered": 1,
00:28:38.687 "num_base_bdevs_operational": 1,
00:28:38.687 "base_bdevs_list": [
00:28:38.687 {
00:28:38.687 "name": null,
00:28:38.687 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:38.687 "is_configured": false,
00:28:38.687 "data_offset": 256,
00:28:38.687 "data_size": 7936
00:28:38.687 },
00:28:38.687 {
00:28:38.687 "name": "BaseBdev2",
00:28:38.687 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:38.687 "is_configured": true,
00:28:38.687 "data_offset": 256,
00:28:38.687 "data_size": 7936
00:28:38.687 }
00:28:38.687 ]
00:28:38.687 }'
00:28:38.687 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:38.687 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:28:38.687 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:38.687 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:28:38.687 10:36:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:28:38.944 [2024-07-15 10:36:16.034277] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:28:38.945 [2024-07-15 10:36:16.036879] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c8280
00:28:38.945 [2024-07-15 10:36:16.038469] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:28:38.945 10:36:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1
00:28:39.875 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:39.875 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:39.875 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:39.875 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:39.875 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:39.875 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:39.875 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:40.132 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:40.132 "name": "raid_bdev1",
00:28:40.132 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:40.132 "strip_size_kb": 0,
00:28:40.132 "state": "online",
00:28:40.132 "raid_level": "raid1",
00:28:40.132 "superblock": true,
00:28:40.132 "num_base_bdevs": 2,
00:28:40.132 "num_base_bdevs_discovered": 2,
00:28:40.132 "num_base_bdevs_operational": 2,
00:28:40.132 "process": {
00:28:40.132 "type": "rebuild",
00:28:40.132 "target": "spare",
00:28:40.132 "progress": {
00:28:40.132 "blocks": 3072,
00:28:40.132 "percent": 38
00:28:40.132 }
00:28:40.132 },
00:28:40.132 "base_bdevs_list": [
00:28:40.132 {
00:28:40.132 "name": "spare",
00:28:40.132 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758",
00:28:40.132 "is_configured": true,
00:28:40.132 "data_offset": 256,
00:28:40.132 "data_size": 7936
00:28:40.132 },
00:28:40.132 {
00:28:40.132 "name": "BaseBdev2",
00:28:40.132 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:40.132 "is_configured": true,
00:28:40.132 "data_offset": 256,
00:28:40.132 "data_size": 7936
00:28:40.132 }
00:28:40.132 ]
00:28:40.132 }'
00:28:40.132 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']'
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']'
00:28:40.389 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']'
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1062
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:40.389 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:40.645 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:40.645 "name": "raid_bdev1",
00:28:40.645 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:40.645 "strip_size_kb": 0,
00:28:40.645 "state": "online",
00:28:40.645 "raid_level": "raid1",
00:28:40.645 "superblock": true,
00:28:40.645 "num_base_bdevs": 2,
00:28:40.645 "num_base_bdevs_discovered": 2,
00:28:40.645 "num_base_bdevs_operational": 2,
00:28:40.645 "process": {
00:28:40.645 "type": "rebuild",
00:28:40.645 "target": "spare",
00:28:40.645 "progress": {
00:28:40.645 "blocks": 3840,
00:28:40.645 "percent": 48
00:28:40.645 }
00:28:40.645 },
00:28:40.646 "base_bdevs_list": [
00:28:40.646 {
00:28:40.646 "name": "spare",
00:28:40.646 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758",
00:28:40.646 "is_configured": true,
00:28:40.646 "data_offset": 256,
00:28:40.646 "data_size": 7936
00:28:40.646 },
00:28:40.646 {
00:28:40.646 "name": "BaseBdev2",
00:28:40.646 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:40.646 "is_configured": true,
00:28:40.646 "data_offset": 256,
00:28:40.646 "data_size": 7936
00:28:40.646 }
00:28:40.646 ]
00:28:40.646 }'
00:28:40.646 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:40.646 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:40.646 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:40.646 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:40.646 10:36:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1
00:28:41.574 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:28:41.575 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:41.575 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:41.575 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:41.575 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:41.575 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:41.575 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:41.575 10:36:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:41.831 10:36:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:41.831 "name": "raid_bdev1",
00:28:41.831 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:41.831 "strip_size_kb": 0,
00:28:41.831 "state": "online",
00:28:41.831 "raid_level": "raid1",
00:28:41.831 "superblock": true,
00:28:41.831 "num_base_bdevs": 2,
00:28:41.831 "num_base_bdevs_discovered": 2,
00:28:41.831 "num_base_bdevs_operational": 2,
00:28:41.831 "process": {
00:28:41.831 "type": "rebuild",
00:28:41.831 "target": "spare",
00:28:41.831 "progress": {
00:28:41.831 "blocks": 7424,
00:28:41.831 "percent": 93
00:28:41.831 }
00:28:41.831 },
00:28:41.831 "base_bdevs_list": [
00:28:41.831 {
00:28:41.831 "name": "spare",
00:28:41.831 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758",
00:28:41.831 "is_configured": true,
00:28:41.831 "data_offset": 256,
00:28:41.831 "data_size": 7936
00:28:41.831 },
00:28:41.831 {
00:28:41.831 "name": "BaseBdev2",
00:28:41.831 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:41.831 "is_configured": true,
00:28:41.831 "data_offset": 256,
00:28:41.831 "data_size": 7936
00:28:41.831 }
00:28:41.831 ]
00:28:41.831 }'
00:28:41.831 10:36:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:42.087 10:36:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:28:42.087 10:36:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:42.087 10:36:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:28:42.087 10:36:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1
00:28:42.087 [2024-07-15 10:36:19.163190] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1
00:28:42.087 [2024-07-15 10:36:19.163247] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1
00:28:42.087 [2024-07-15 10:36:19.163328] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout ))
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:43.027 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:43.288 "name": "raid_bdev1",
00:28:43.288 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:43.288 "strip_size_kb": 0,
00:28:43.288 "state": "online",
00:28:43.288 "raid_level": "raid1",
00:28:43.288 "superblock": true,
00:28:43.288 "num_base_bdevs": 2,
00:28:43.288 "num_base_bdevs_discovered": 2,
00:28:43.288 "num_base_bdevs_operational": 2,
00:28:43.288 "base_bdevs_list": [
00:28:43.288 {
00:28:43.288 "name": "spare",
00:28:43.288 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758",
00:28:43.288 "is_configured": true,
00:28:43.288 "data_offset": 256,
00:28:43.288 "data_size": 7936
00:28:43.288 },
00:28:43.288 {
00:28:43.288 "name": "BaseBdev2",
00:28:43.288 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:43.288 "is_configured": true,
00:28:43.288 "data_offset": 256,
00:28:43.288 "data_size": 7936
00:28:43.288 }
00:28:43.288 ]
00:28:43.288 }'
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]]
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]]
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:43.288 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:43.640 "name": "raid_bdev1",
00:28:43.640 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:43.640 "strip_size_kb": 0,
00:28:43.640 "state": "online",
00:28:43.640 "raid_level": "raid1",
00:28:43.640 "superblock": true,
00:28:43.640 "num_base_bdevs": 2,
00:28:43.640 "num_base_bdevs_discovered": 2,
00:28:43.640 "num_base_bdevs_operational": 2,
00:28:43.640 "base_bdevs_list": [
00:28:43.640 {
00:28:43.640 "name": "spare",
00:28:43.640 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758",
00:28:43.640 "is_configured": true,
00:28:43.640 "data_offset": 256,
00:28:43.640 "data_size": 7936
00:28:43.640 },
00:28:43.640 {
00:28:43.640 "name": "BaseBdev2",
00:28:43.640 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:43.640 "is_configured": true,
00:28:43.640 "data_offset": 256,
00:28:43.640 "data_size": 7936
00:28:43.640 }
00:28:43.640 ]
00:28:43.640 }'
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:43.640 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:43.896 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:43.896 "name": "raid_bdev1",
00:28:43.896 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd",
00:28:43.896 "strip_size_kb": 0,
00:28:43.896 "state": "online",
00:28:43.896 "raid_level": "raid1",
00:28:43.896 "superblock": true,
00:28:43.896 "num_base_bdevs": 2,
00:28:43.896 "num_base_bdevs_discovered": 2,
00:28:43.896 "num_base_bdevs_operational": 2,
00:28:43.896 "base_bdevs_list": [
00:28:43.896 {
00:28:43.896 "name": "spare",
00:28:43.896 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758",
00:28:43.896 "is_configured": true,
00:28:43.896 "data_offset": 256,
00:28:43.896 "data_size": 7936
00:28:43.896 },
00:28:43.896 {
00:28:43.896 "name": "BaseBdev2",
00:28:43.896 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa",
00:28:43.896 "is_configured": true,
00:28:43.896 "data_offset": 256,
00:28:43.896 "data_size": 7936
00:28:43.896 }
00:28:43.896 ]
00:28:43.896 }'
00:28:43.896 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:43.896 10:36:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x
00:28:44.459 10:36:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:28:44.717 [2024-07-15 10:36:21.757907] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:28:44.717 [2024-07-15 10:36:21.757939] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:28:44.717 [2024-07-15 10:36:21.757997] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:28:44.717 [2024-07-15 10:36:21.758053] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:28:44.717 [2024-07-15 10:36:21.758065] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x251d1c0 name raid_bdev1, state offline
00:28:44.717 10:36:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:44.717 10:36:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]]
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']'
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']'
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1'
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare')
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:28:44.975 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0
00:28:45.231 /dev/nbd0
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:28:45.231 1+0 records in
00:28:45.231 1+0 records out
00:28:45.231 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239901 s, 17.1 MB/s
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:28:45.231 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1
00:28:45.488 /dev/nbd1
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:28:45.488 1+0 records in
00:28:45.488 1+0 records out
00:28:45.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032266 s, 12.7 MB/s
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1'
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:28:45.488 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:28:45.745 10:36:22
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:45.745 10:36:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:46.001 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:46.256 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:46.513 [2024-07-15 10:36:23.558400] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:46.513 [2024-07-15 10:36:23.558446] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:46.513 [2024-07-15 10:36:23.558466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251c9d0 00:28:46.513 [2024-07-15 10:36:23.558479] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:46.513 [2024-07-15 10:36:23.559958] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:46.513 [2024-07-15 10:36:23.559987] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:46.513 [2024-07-15 10:36:23.560046] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:46.513 [2024-07-15 10:36:23.560074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:46.513 [2024-07-15 10:36:23.560167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:46.513 spare 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.513 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:46.513 [2024-07-15 10:36:23.660474] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x242e7c0 00:28:46.513 [2024-07-15 10:36:23.660489] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:46.513 [2024-07-15 10:36:23.660565] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x251ecd0 00:28:46.513 [2024-07-15 10:36:23.660688] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x242e7c0 00:28:46.513 [2024-07-15 10:36:23.660698] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x242e7c0 00:28:46.513 [2024-07-15 10:36:23.660778] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:46.770 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:46.770 "name": "raid_bdev1", 00:28:46.770 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:46.770 "strip_size_kb": 0, 00:28:46.770 "state": "online", 00:28:46.770 "raid_level": "raid1", 00:28:46.770 "superblock": true, 00:28:46.770 "num_base_bdevs": 2, 00:28:46.770 
"num_base_bdevs_discovered": 2, 00:28:46.770 "num_base_bdevs_operational": 2, 00:28:46.770 "base_bdevs_list": [ 00:28:46.770 { 00:28:46.770 "name": "spare", 00:28:46.770 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758", 00:28:46.770 "is_configured": true, 00:28:46.770 "data_offset": 256, 00:28:46.770 "data_size": 7936 00:28:46.770 }, 00:28:46.770 { 00:28:46.770 "name": "BaseBdev2", 00:28:46.770 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:46.770 "is_configured": true, 00:28:46.770 "data_offset": 256, 00:28:46.770 "data_size": 7936 00:28:46.770 } 00:28:46.770 ] 00:28:46.770 }' 00:28:46.770 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:46.770 10:36:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:47.333 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:47.333 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:47.333 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:47.333 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:47.333 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:47.333 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.333 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:47.590 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:47.590 "name": "raid_bdev1", 00:28:47.590 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:47.590 
"strip_size_kb": 0, 00:28:47.590 "state": "online", 00:28:47.590 "raid_level": "raid1", 00:28:47.590 "superblock": true, 00:28:47.590 "num_base_bdevs": 2, 00:28:47.590 "num_base_bdevs_discovered": 2, 00:28:47.590 "num_base_bdevs_operational": 2, 00:28:47.590 "base_bdevs_list": [ 00:28:47.590 { 00:28:47.590 "name": "spare", 00:28:47.590 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758", 00:28:47.590 "is_configured": true, 00:28:47.590 "data_offset": 256, 00:28:47.590 "data_size": 7936 00:28:47.590 }, 00:28:47.590 { 00:28:47.590 "name": "BaseBdev2", 00:28:47.590 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:47.590 "is_configured": true, 00:28:47.590 "data_offset": 256, 00:28:47.590 "data_size": 7936 00:28:47.590 } 00:28:47.590 ] 00:28:47.590 }' 00:28:47.590 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:47.590 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:47.590 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:47.590 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:47.590 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:47.591 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:47.847 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:47.847 10:36:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:48.105 [2024-07-15 10:36:25.154755] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:48.105 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:48.362 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:48.362 "name": "raid_bdev1", 00:28:48.362 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:48.362 "strip_size_kb": 0, 00:28:48.362 "state": "online", 00:28:48.362 "raid_level": "raid1", 00:28:48.362 "superblock": true, 00:28:48.362 
"num_base_bdevs": 2, 00:28:48.362 "num_base_bdevs_discovered": 1, 00:28:48.362 "num_base_bdevs_operational": 1, 00:28:48.362 "base_bdevs_list": [ 00:28:48.362 { 00:28:48.362 "name": null, 00:28:48.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:48.362 "is_configured": false, 00:28:48.362 "data_offset": 256, 00:28:48.362 "data_size": 7936 00:28:48.362 }, 00:28:48.362 { 00:28:48.362 "name": "BaseBdev2", 00:28:48.362 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:48.362 "is_configured": true, 00:28:48.362 "data_offset": 256, 00:28:48.362 "data_size": 7936 00:28:48.362 } 00:28:48.362 ] 00:28:48.362 }' 00:28:48.362 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:48.362 10:36:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:48.927 10:36:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:49.184 [2024-07-15 10:36:26.233620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:49.184 [2024-07-15 10:36:26.233778] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:49.184 [2024-07-15 10:36:26.233794] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:49.184 [2024-07-15 10:36:26.233821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:49.184 [2024-07-15 10:36:26.236042] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c71d0 00:28:49.184 [2024-07-15 10:36:26.237381] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:49.184 10:36:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:50.114 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:50.114 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:50.114 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:50.114 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:50.114 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:50.114 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:50.114 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:50.371 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:50.371 "name": "raid_bdev1", 00:28:50.371 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:50.371 "strip_size_kb": 0, 00:28:50.371 "state": "online", 00:28:50.371 "raid_level": "raid1", 00:28:50.371 "superblock": true, 00:28:50.371 "num_base_bdevs": 2, 00:28:50.371 "num_base_bdevs_discovered": 2, 00:28:50.371 "num_base_bdevs_operational": 2, 00:28:50.371 "process": { 00:28:50.371 "type": "rebuild", 00:28:50.371 
"target": "spare", 00:28:50.371 "progress": { 00:28:50.371 "blocks": 3072, 00:28:50.371 "percent": 38 00:28:50.371 } 00:28:50.371 }, 00:28:50.371 "base_bdevs_list": [ 00:28:50.371 { 00:28:50.371 "name": "spare", 00:28:50.371 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758", 00:28:50.371 "is_configured": true, 00:28:50.371 "data_offset": 256, 00:28:50.371 "data_size": 7936 00:28:50.371 }, 00:28:50.371 { 00:28:50.371 "name": "BaseBdev2", 00:28:50.371 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:50.371 "is_configured": true, 00:28:50.371 "data_offset": 256, 00:28:50.371 "data_size": 7936 00:28:50.371 } 00:28:50.371 ] 00:28:50.371 }' 00:28:50.371 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:50.371 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:50.371 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:50.628 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:50.628 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:50.628 [2024-07-15 10:36:27.823890] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:50.885 [2024-07-15 10:36:27.850211] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:50.885 [2024-07-15 10:36:27.850267] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:50.885 [2024-07-15 10:36:27.850283] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:50.885 [2024-07-15 10:36:27.850292] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:50.885 10:36:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.142 10:36:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:51.142 "name": "raid_bdev1", 00:28:51.142 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:51.142 "strip_size_kb": 0, 00:28:51.142 "state": "online", 00:28:51.142 "raid_level": "raid1", 00:28:51.142 "superblock": true, 00:28:51.142 "num_base_bdevs": 2, 00:28:51.142 "num_base_bdevs_discovered": 1, 
00:28:51.142 "num_base_bdevs_operational": 1, 00:28:51.142 "base_bdevs_list": [ 00:28:51.142 { 00:28:51.142 "name": null, 00:28:51.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:51.142 "is_configured": false, 00:28:51.142 "data_offset": 256, 00:28:51.142 "data_size": 7936 00:28:51.142 }, 00:28:51.142 { 00:28:51.142 "name": "BaseBdev2", 00:28:51.142 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:51.142 "is_configured": true, 00:28:51.142 "data_offset": 256, 00:28:51.142 "data_size": 7936 00:28:51.142 } 00:28:51.142 ] 00:28:51.142 }' 00:28:51.142 10:36:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:51.142 10:36:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:51.705 10:36:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:51.962 [2024-07-15 10:36:28.964372] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:51.962 [2024-07-15 10:36:28.964424] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:51.962 [2024-07-15 10:36:28.964447] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2561810 00:28:51.962 [2024-07-15 10:36:28.964461] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:51.962 [2024-07-15 10:36:28.964682] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:51.962 [2024-07-15 10:36:28.964699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:51.962 [2024-07-15 10:36:28.964758] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:51.962 [2024-07-15 10:36:28.964772] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:28:51.962 [2024-07-15 10:36:28.964784] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:51.962 [2024-07-15 10:36:28.964803] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:51.962 [2024-07-15 10:36:28.967032] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23c7980 00:28:51.962 [2024-07-15 10:36:28.968362] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:51.962 spare 00:28:51.962 10:36:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:52.893 10:36:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:52.893 10:36:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:52.893 10:36:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:52.893 10:36:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:52.893 10:36:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:52.893 10:36:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:52.893 10:36:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.150 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:53.150 "name": "raid_bdev1", 00:28:53.150 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:53.150 "strip_size_kb": 0, 00:28:53.150 "state": "online", 00:28:53.150 "raid_level": "raid1", 00:28:53.150 "superblock": 
true, 00:28:53.150 "num_base_bdevs": 2, 00:28:53.150 "num_base_bdevs_discovered": 2, 00:28:53.150 "num_base_bdevs_operational": 2, 00:28:53.150 "process": { 00:28:53.150 "type": "rebuild", 00:28:53.150 "target": "spare", 00:28:53.150 "progress": { 00:28:53.150 "blocks": 3072, 00:28:53.150 "percent": 38 00:28:53.150 } 00:28:53.150 }, 00:28:53.150 "base_bdevs_list": [ 00:28:53.150 { 00:28:53.150 "name": "spare", 00:28:53.150 "uuid": "3a12502b-c25d-546d-b1b9-d6c8d532d758", 00:28:53.150 "is_configured": true, 00:28:53.150 "data_offset": 256, 00:28:53.150 "data_size": 7936 00:28:53.150 }, 00:28:53.150 { 00:28:53.150 "name": "BaseBdev2", 00:28:53.150 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:53.150 "is_configured": true, 00:28:53.150 "data_offset": 256, 00:28:53.150 "data_size": 7936 00:28:53.150 } 00:28:53.150 ] 00:28:53.150 }' 00:28:53.150 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:53.150 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:53.150 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:53.150 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:53.150 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:53.407 [2024-07-15 10:36:30.565600] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:53.408 [2024-07-15 10:36:30.580844] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:53.408 [2024-07-15 10:36:30.580892] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:53.408 [2024-07-15 10:36:30.580908] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:53.408 [2024-07-15 10:36:30.580916] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:53.665 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:53.922 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:53.922 "name": "raid_bdev1", 00:28:53.922 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 
00:28:53.922 "strip_size_kb": 0, 00:28:53.922 "state": "online", 00:28:53.922 "raid_level": "raid1", 00:28:53.922 "superblock": true, 00:28:53.922 "num_base_bdevs": 2, 00:28:53.922 "num_base_bdevs_discovered": 1, 00:28:53.922 "num_base_bdevs_operational": 1, 00:28:53.922 "base_bdevs_list": [ 00:28:53.922 { 00:28:53.922 "name": null, 00:28:53.922 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:53.922 "is_configured": false, 00:28:53.922 "data_offset": 256, 00:28:53.922 "data_size": 7936 00:28:53.922 }, 00:28:53.922 { 00:28:53.922 "name": "BaseBdev2", 00:28:53.922 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:53.922 "is_configured": true, 00:28:53.922 "data_offset": 256, 00:28:53.922 "data_size": 7936 00:28:53.922 } 00:28:53.922 ] 00:28:53.922 }' 00:28:53.922 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:53.922 10:36:30 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:54.486 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:54.486 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:54.486 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:54.486 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:54.486 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:54.486 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.486 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.743 10:36:31 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:54.743 "name": "raid_bdev1", 00:28:54.743 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:54.743 "strip_size_kb": 0, 00:28:54.743 "state": "online", 00:28:54.743 "raid_level": "raid1", 00:28:54.743 "superblock": true, 00:28:54.743 "num_base_bdevs": 2, 00:28:54.743 "num_base_bdevs_discovered": 1, 00:28:54.743 "num_base_bdevs_operational": 1, 00:28:54.743 "base_bdevs_list": [ 00:28:54.743 { 00:28:54.743 "name": null, 00:28:54.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:54.743 "is_configured": false, 00:28:54.743 "data_offset": 256, 00:28:54.743 "data_size": 7936 00:28:54.743 }, 00:28:54.743 { 00:28:54.743 "name": "BaseBdev2", 00:28:54.743 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:54.743 "is_configured": true, 00:28:54.743 "data_offset": 256, 00:28:54.743 "data_size": 7936 00:28:54.743 } 00:28:54.743 ] 00:28:54.743 }' 00:28:54.743 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:54.743 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:54.743 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:54.744 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:54.744 10:36:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:55.000 10:36:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:55.257 [2024-07-15 10:36:32.277000] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:28:55.257 [2024-07-15 10:36:32.277053] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:55.258 [2024-07-15 10:36:32.277076] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23c8900 00:28:55.258 [2024-07-15 10:36:32.277090] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:55.258 [2024-07-15 10:36:32.277283] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:55.258 [2024-07-15 10:36:32.277299] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:55.258 [2024-07-15 10:36:32.277347] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:55.258 [2024-07-15 10:36:32.277360] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:55.258 [2024-07-15 10:36:32.277371] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:55.258 BaseBdev1 00:28:55.258 10:36:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:56.187 
10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.187 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.444 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:56.444 "name": "raid_bdev1", 00:28:56.444 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:56.444 "strip_size_kb": 0, 00:28:56.444 "state": "online", 00:28:56.444 "raid_level": "raid1", 00:28:56.444 "superblock": true, 00:28:56.444 "num_base_bdevs": 2, 00:28:56.444 "num_base_bdevs_discovered": 1, 00:28:56.444 "num_base_bdevs_operational": 1, 00:28:56.444 "base_bdevs_list": [ 00:28:56.444 { 00:28:56.444 "name": null, 00:28:56.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:56.444 "is_configured": false, 00:28:56.444 "data_offset": 256, 00:28:56.444 "data_size": 7936 00:28:56.444 }, 00:28:56.444 { 00:28:56.444 "name": "BaseBdev2", 00:28:56.444 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:56.444 "is_configured": true, 00:28:56.444 "data_offset": 256, 00:28:56.444 "data_size": 7936 00:28:56.444 } 00:28:56.444 ] 00:28:56.444 }' 00:28:56.444 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:56.444 10:36:33 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:28:57.006 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:57.006 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:57.007 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:57.007 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:57.007 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:57.007 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.007 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:57.263 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:57.263 "name": "raid_bdev1", 00:28:57.263 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:57.263 "strip_size_kb": 0, 00:28:57.263 "state": "online", 00:28:57.263 "raid_level": "raid1", 00:28:57.263 "superblock": true, 00:28:57.263 "num_base_bdevs": 2, 00:28:57.263 "num_base_bdevs_discovered": 1, 00:28:57.263 "num_base_bdevs_operational": 1, 00:28:57.263 "base_bdevs_list": [ 00:28:57.263 { 00:28:57.263 "name": null, 00:28:57.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.263 "is_configured": false, 00:28:57.263 "data_offset": 256, 00:28:57.263 "data_size": 7936 00:28:57.263 }, 00:28:57.263 { 00:28:57.263 "name": "BaseBdev2", 00:28:57.263 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:57.263 "is_configured": true, 00:28:57.263 "data_offset": 256, 00:28:57.263 "data_size": 7936 00:28:57.263 } 00:28:57.263 ] 00:28:57.263 }' 00:28:57.263 10:36:34 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:57.263 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:57.263 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:57.520 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:57.777 [2024-07-15 10:36:34.719492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:57.777 [2024-07-15 10:36:34.719620] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:57.777 [2024-07-15 10:36:34.719635] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:57.777 request: 00:28:57.777 { 00:28:57.777 "base_bdev": "BaseBdev1", 00:28:57.777 "raid_bdev": "raid_bdev1", 00:28:57.777 "method": "bdev_raid_add_base_bdev", 00:28:57.777 "req_id": 1 00:28:57.777 } 00:28:57.777 Got JSON-RPC error response 00:28:57.777 response: 00:28:57.777 { 00:28:57.777 "code": -22, 00:28:57.777 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:57.777 } 00:28:57.777 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:28:57.777 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:57.777 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:57.777 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:57.777 10:36:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:58.707 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:58.708 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.708 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.964 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:58.964 "name": "raid_bdev1", 00:28:58.964 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:58.964 "strip_size_kb": 0, 00:28:58.964 "state": "online", 00:28:58.964 "raid_level": "raid1", 00:28:58.965 "superblock": true, 00:28:58.965 "num_base_bdevs": 2, 00:28:58.965 "num_base_bdevs_discovered": 1, 
00:28:58.965 "num_base_bdevs_operational": 1, 00:28:58.965 "base_bdevs_list": [ 00:28:58.965 { 00:28:58.965 "name": null, 00:28:58.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.965 "is_configured": false, 00:28:58.965 "data_offset": 256, 00:28:58.965 "data_size": 7936 00:28:58.965 }, 00:28:58.965 { 00:28:58.965 "name": "BaseBdev2", 00:28:58.965 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:58.965 "is_configured": true, 00:28:58.965 "data_offset": 256, 00:28:58.965 "data_size": 7936 00:28:58.965 } 00:28:58.965 ] 00:28:58.965 }' 00:28:58.965 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:58.965 10:36:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:28:59.528 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:59.528 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:59.528 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:59.528 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:59.528 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:59.528 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.528 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:59.785 "name": "raid_bdev1", 00:28:59.785 "uuid": "b54a4b67-4494-424d-979c-ced8d21714dd", 00:28:59.785 "strip_size_kb": 0, 00:28:59.785 
"state": "online", 00:28:59.785 "raid_level": "raid1", 00:28:59.785 "superblock": true, 00:28:59.785 "num_base_bdevs": 2, 00:28:59.785 "num_base_bdevs_discovered": 1, 00:28:59.785 "num_base_bdevs_operational": 1, 00:28:59.785 "base_bdevs_list": [ 00:28:59.785 { 00:28:59.785 "name": null, 00:28:59.785 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:59.785 "is_configured": false, 00:28:59.785 "data_offset": 256, 00:28:59.785 "data_size": 7936 00:28:59.785 }, 00:28:59.785 { 00:28:59.785 "name": "BaseBdev2", 00:28:59.785 "uuid": "eff001e7-9ee3-5b72-8a89-da424e6268aa", 00:28:59.785 "is_configured": true, 00:28:59.785 "data_offset": 256, 00:28:59.785 "data_size": 7936 00:28:59.785 } 00:28:59.785 ] 00:28:59.785 }' 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 625040 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 625040 ']' 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 625040 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:59.785 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 625040 00:28:59.786 10:36:36 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:59.786 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:59.786 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 625040' 00:28:59.786 killing process with pid 625040 00:28:59.786 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 625040 00:28:59.786 Received shutdown signal, test time was about 60.000000 seconds 00:28:59.786 00:28:59.786 Latency(us) 00:28:59.786 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:59.786 =================================================================================================================== 00:28:59.786 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:59.786 [2024-07-15 10:36:36.968285] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:59.786 [2024-07-15 10:36:36.968372] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:59.786 [2024-07-15 10:36:36.968418] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:59.786 [2024-07-15 10:36:36.968431] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x242e7c0 name raid_bdev1, state offline 00:28:59.786 10:36:36 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 625040 00:29:00.044 [2024-07-15 10:36:37.001488] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:00.044 10:36:37 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:29:00.044 00:29:00.044 real 0m31.612s 00:29:00.044 user 0m49.393s 00:29:00.044 sys 0m5.097s 00:29:00.044 10:36:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:29:00.044 10:36:37 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:00.044 ************************************ 00:29:00.044 END TEST raid_rebuild_test_sb_md_separate 00:29:00.044 ************************************ 00:29:00.302 10:36:37 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:00.302 10:36:37 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:29:00.302 10:36:37 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:29:00.302 10:36:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:00.302 10:36:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:00.302 10:36:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:00.302 ************************************ 00:29:00.302 START TEST raid_state_function_test_sb_md_interleaved 00:29:00.302 ************************************ 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:00.302 10:36:37 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=629678 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 629678' 00:29:00.302 Process raid pid: 629678 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 629678 /var/tmp/spdk-raid.sock 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 629678 ']' 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:00.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:00.302 10:36:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:00.302 [2024-07-15 10:36:37.355854] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:29:00.302 [2024-07-15 10:36:37.355919] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:00.302 [2024-07-15 10:36:37.482966] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:00.559 [2024-07-15 10:36:37.586785] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:00.559 [2024-07-15 10:36:37.651914] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:00.559 [2024-07-15 10:36:37.651959] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:01.121 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:01.121 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:01.121 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:01.377 [2024-07-15 10:36:38.507602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:01.377 [2024-07-15 10:36:38.507647] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:01.377 [2024-07-15 10:36:38.507658] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:01.377 [2024-07-15 10:36:38.507670] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.377 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:01.633 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:01.633 "name": "Existed_Raid", 00:29:01.633 "uuid": "31b41a19-e85c-4aaf-b550-824ad3bdcc6f", 00:29:01.633 "strip_size_kb": 0, 00:29:01.633 "state": "configuring", 00:29:01.633 "raid_level": "raid1", 00:29:01.633 "superblock": true, 00:29:01.633 "num_base_bdevs": 2, 00:29:01.633 "num_base_bdevs_discovered": 0, 00:29:01.633 "num_base_bdevs_operational": 2, 00:29:01.633 "base_bdevs_list": [ 00:29:01.633 { 
00:29:01.633 "name": "BaseBdev1", 00:29:01.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:01.633 "is_configured": false, 00:29:01.633 "data_offset": 0, 00:29:01.633 "data_size": 0 00:29:01.633 }, 00:29:01.633 { 00:29:01.633 "name": "BaseBdev2", 00:29:01.633 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:01.633 "is_configured": false, 00:29:01.633 "data_offset": 0, 00:29:01.633 "data_size": 0 00:29:01.633 } 00:29:01.633 ] 00:29:01.633 }' 00:29:01.633 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:01.633 10:36:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:02.228 10:36:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:02.484 [2024-07-15 10:36:39.566279] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:02.484 [2024-07-15 10:36:39.566314] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2552a80 name Existed_Raid, state configuring 00:29:02.484 10:36:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:02.740 [2024-07-15 10:36:39.810953] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:02.740 [2024-07-15 10:36:39.810990] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:02.740 [2024-07-15 10:36:39.811001] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:02.740 [2024-07-15 10:36:39.811013] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:02.740 
10:36:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:29:02.996 [2024-07-15 10:36:40.053826] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:02.996 BaseBdev1 00:29:02.996 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:02.996 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:29:02.996 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:02.996 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:02.996 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:02.996 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:02.996 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:03.253 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:03.509 [ 00:29:03.509 { 00:29:03.509 "name": "BaseBdev1", 00:29:03.509 "aliases": [ 00:29:03.509 "4ba8422e-101a-4d76-a3a1-a73046195425" 00:29:03.509 ], 00:29:03.509 "product_name": "Malloc disk", 00:29:03.509 "block_size": 4128, 00:29:03.509 "num_blocks": 8192, 00:29:03.509 "uuid": "4ba8422e-101a-4d76-a3a1-a73046195425", 00:29:03.509 "md_size": 32, 00:29:03.509 
"md_interleave": true, 00:29:03.509 "dif_type": 0, 00:29:03.509 "assigned_rate_limits": { 00:29:03.509 "rw_ios_per_sec": 0, 00:29:03.509 "rw_mbytes_per_sec": 0, 00:29:03.509 "r_mbytes_per_sec": 0, 00:29:03.509 "w_mbytes_per_sec": 0 00:29:03.509 }, 00:29:03.509 "claimed": true, 00:29:03.509 "claim_type": "exclusive_write", 00:29:03.509 "zoned": false, 00:29:03.509 "supported_io_types": { 00:29:03.509 "read": true, 00:29:03.509 "write": true, 00:29:03.509 "unmap": true, 00:29:03.509 "flush": true, 00:29:03.509 "reset": true, 00:29:03.509 "nvme_admin": false, 00:29:03.509 "nvme_io": false, 00:29:03.509 "nvme_io_md": false, 00:29:03.509 "write_zeroes": true, 00:29:03.509 "zcopy": true, 00:29:03.509 "get_zone_info": false, 00:29:03.509 "zone_management": false, 00:29:03.509 "zone_append": false, 00:29:03.509 "compare": false, 00:29:03.509 "compare_and_write": false, 00:29:03.509 "abort": true, 00:29:03.509 "seek_hole": false, 00:29:03.509 "seek_data": false, 00:29:03.509 "copy": true, 00:29:03.509 "nvme_iov_md": false 00:29:03.509 }, 00:29:03.509 "memory_domains": [ 00:29:03.509 { 00:29:03.509 "dma_device_id": "system", 00:29:03.509 "dma_device_type": 1 00:29:03.509 }, 00:29:03.509 { 00:29:03.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:03.509 "dma_device_type": 2 00:29:03.509 } 00:29:03.509 ], 00:29:03.509 "driver_specific": {} 00:29:03.509 } 00:29:03.509 ] 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:03.509 10:36:40 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.509 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:03.766 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:03.766 "name": "Existed_Raid", 00:29:03.766 "uuid": "0063acc5-e507-4268-afe6-ad9d62ee98fc", 00:29:03.766 "strip_size_kb": 0, 00:29:03.766 "state": "configuring", 00:29:03.766 "raid_level": "raid1", 00:29:03.766 "superblock": true, 00:29:03.766 "num_base_bdevs": 2, 00:29:03.766 "num_base_bdevs_discovered": 1, 00:29:03.766 "num_base_bdevs_operational": 2, 00:29:03.766 "base_bdevs_list": [ 00:29:03.766 { 00:29:03.766 "name": "BaseBdev1", 00:29:03.766 "uuid": "4ba8422e-101a-4d76-a3a1-a73046195425", 00:29:03.766 "is_configured": true, 00:29:03.766 "data_offset": 256, 00:29:03.766 "data_size": 7936 00:29:03.766 }, 
00:29:03.766 { 00:29:03.766 "name": "BaseBdev2", 00:29:03.766 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:03.766 "is_configured": false, 00:29:03.766 "data_offset": 0, 00:29:03.766 "data_size": 0 00:29:03.766 } 00:29:03.766 ] 00:29:03.766 }' 00:29:03.766 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:03.766 10:36:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:04.330 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:04.587 [2024-07-15 10:36:41.593950] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:04.587 [2024-07-15 10:36:41.593994] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2552350 name Existed_Raid, state configuring 00:29:04.587 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:04.587 [2024-07-15 10:36:41.766436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:04.587 [2024-07-15 10:36:41.767915] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:04.587 [2024-07-15 10:36:41.767955] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:04.843 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:04.843 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:04.843 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:04.843 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:04.843 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:04.843 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.844 10:36:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:04.844 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:04.844 "name": "Existed_Raid", 00:29:04.844 "uuid": "6f42f529-7335-427c-af03-5ab092d134ef", 00:29:04.844 "strip_size_kb": 0, 00:29:04.844 "state": "configuring", 00:29:04.844 "raid_level": "raid1", 00:29:04.844 "superblock": true, 00:29:04.844 "num_base_bdevs": 2, 
00:29:04.844 "num_base_bdevs_discovered": 1, 00:29:04.844 "num_base_bdevs_operational": 2, 00:29:04.844 "base_bdevs_list": [ 00:29:04.844 { 00:29:04.844 "name": "BaseBdev1", 00:29:04.844 "uuid": "4ba8422e-101a-4d76-a3a1-a73046195425", 00:29:04.844 "is_configured": true, 00:29:04.844 "data_offset": 256, 00:29:04.844 "data_size": 7936 00:29:04.844 }, 00:29:04.844 { 00:29:04.844 "name": "BaseBdev2", 00:29:04.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:04.844 "is_configured": false, 00:29:04.844 "data_offset": 0, 00:29:04.844 "data_size": 0 00:29:04.844 } 00:29:04.844 ] 00:29:04.844 }' 00:29:04.844 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:04.844 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:29:05.774 [2024-07-15 10:36:42.756633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:05.774 [2024-07-15 10:36:42.756770] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2554180 00:29:05.774 [2024-07-15 10:36:42.756783] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:05.774 [2024-07-15 10:36:42.756842] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2554150 00:29:05.774 [2024-07-15 10:36:42.756918] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2554180 00:29:05.774 [2024-07-15 10:36:42.756943] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2554180 00:29:05.774 [2024-07-15 10:36:42.757003] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:05.774 BaseBdev2 
00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:05.774 10:36:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:06.032 [ 00:29:06.032 { 00:29:06.032 "name": "BaseBdev2", 00:29:06.032 "aliases": [ 00:29:06.032 "8143e325-45f2-4c10-bf34-b248fa923d3f" 00:29:06.032 ], 00:29:06.032 "product_name": "Malloc disk", 00:29:06.032 "block_size": 4128, 00:29:06.032 "num_blocks": 8192, 00:29:06.032 "uuid": "8143e325-45f2-4c10-bf34-b248fa923d3f", 00:29:06.032 "md_size": 32, 00:29:06.032 "md_interleave": true, 00:29:06.032 "dif_type": 0, 00:29:06.032 "assigned_rate_limits": { 00:29:06.032 "rw_ios_per_sec": 0, 00:29:06.032 "rw_mbytes_per_sec": 0, 00:29:06.032 "r_mbytes_per_sec": 0, 00:29:06.032 "w_mbytes_per_sec": 0 00:29:06.032 }, 00:29:06.032 "claimed": true, 00:29:06.032 "claim_type": "exclusive_write", 00:29:06.032 "zoned": false, 00:29:06.032 "supported_io_types": { 
00:29:06.032 "read": true, 00:29:06.032 "write": true, 00:29:06.032 "unmap": true, 00:29:06.032 "flush": true, 00:29:06.032 "reset": true, 00:29:06.032 "nvme_admin": false, 00:29:06.032 "nvme_io": false, 00:29:06.032 "nvme_io_md": false, 00:29:06.032 "write_zeroes": true, 00:29:06.032 "zcopy": true, 00:29:06.032 "get_zone_info": false, 00:29:06.032 "zone_management": false, 00:29:06.032 "zone_append": false, 00:29:06.032 "compare": false, 00:29:06.032 "compare_and_write": false, 00:29:06.032 "abort": true, 00:29:06.032 "seek_hole": false, 00:29:06.032 "seek_data": false, 00:29:06.032 "copy": true, 00:29:06.032 "nvme_iov_md": false 00:29:06.032 }, 00:29:06.032 "memory_domains": [ 00:29:06.032 { 00:29:06.032 "dma_device_id": "system", 00:29:06.032 "dma_device_type": 1 00:29:06.032 }, 00:29:06.032 { 00:29:06.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:06.032 "dma_device_type": 2 00:29:06.032 } 00:29:06.032 ], 00:29:06.032 "driver_specific": {} 00:29:06.032 } 00:29:06.032 ] 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.032 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:06.288 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.288 "name": "Existed_Raid", 00:29:06.288 "uuid": "6f42f529-7335-427c-af03-5ab092d134ef", 00:29:06.288 "strip_size_kb": 0, 00:29:06.288 "state": "online", 00:29:06.288 "raid_level": "raid1", 00:29:06.288 "superblock": true, 00:29:06.288 "num_base_bdevs": 2, 00:29:06.288 "num_base_bdevs_discovered": 2, 00:29:06.288 "num_base_bdevs_operational": 2, 00:29:06.288 "base_bdevs_list": [ 00:29:06.288 { 00:29:06.288 "name": "BaseBdev1", 00:29:06.288 "uuid": "4ba8422e-101a-4d76-a3a1-a73046195425", 00:29:06.288 "is_configured": true, 00:29:06.288 "data_offset": 256, 00:29:06.288 "data_size": 7936 00:29:06.288 }, 00:29:06.288 { 00:29:06.288 "name": "BaseBdev2", 00:29:06.288 "uuid": "8143e325-45f2-4c10-bf34-b248fa923d3f", 00:29:06.288 "is_configured": true, 00:29:06.288 "data_offset": 256, 00:29:06.288 
"data_size": 7936 00:29:06.288 } 00:29:06.288 ] 00:29:06.288 }' 00:29:06.289 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.289 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:29:06.852 10:36:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:07.109 [2024-07-15 10:36:44.168680] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:07.109 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:07.109 "name": "Existed_Raid", 00:29:07.109 "aliases": [ 00:29:07.109 "6f42f529-7335-427c-af03-5ab092d134ef" 00:29:07.109 ], 00:29:07.109 "product_name": "Raid Volume", 00:29:07.109 "block_size": 4128, 00:29:07.109 "num_blocks": 7936, 00:29:07.109 "uuid": "6f42f529-7335-427c-af03-5ab092d134ef", 00:29:07.109 "md_size": 32, 
00:29:07.109 "md_interleave": true, 00:29:07.109 "dif_type": 0, 00:29:07.109 "assigned_rate_limits": { 00:29:07.109 "rw_ios_per_sec": 0, 00:29:07.109 "rw_mbytes_per_sec": 0, 00:29:07.109 "r_mbytes_per_sec": 0, 00:29:07.109 "w_mbytes_per_sec": 0 00:29:07.109 }, 00:29:07.109 "claimed": false, 00:29:07.109 "zoned": false, 00:29:07.109 "supported_io_types": { 00:29:07.109 "read": true, 00:29:07.109 "write": true, 00:29:07.109 "unmap": false, 00:29:07.109 "flush": false, 00:29:07.109 "reset": true, 00:29:07.109 "nvme_admin": false, 00:29:07.109 "nvme_io": false, 00:29:07.109 "nvme_io_md": false, 00:29:07.109 "write_zeroes": true, 00:29:07.109 "zcopy": false, 00:29:07.109 "get_zone_info": false, 00:29:07.109 "zone_management": false, 00:29:07.109 "zone_append": false, 00:29:07.109 "compare": false, 00:29:07.109 "compare_and_write": false, 00:29:07.109 "abort": false, 00:29:07.109 "seek_hole": false, 00:29:07.109 "seek_data": false, 00:29:07.109 "copy": false, 00:29:07.109 "nvme_iov_md": false 00:29:07.109 }, 00:29:07.110 "memory_domains": [ 00:29:07.110 { 00:29:07.110 "dma_device_id": "system", 00:29:07.110 "dma_device_type": 1 00:29:07.110 }, 00:29:07.110 { 00:29:07.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.110 "dma_device_type": 2 00:29:07.110 }, 00:29:07.110 { 00:29:07.110 "dma_device_id": "system", 00:29:07.110 "dma_device_type": 1 00:29:07.110 }, 00:29:07.110 { 00:29:07.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.110 "dma_device_type": 2 00:29:07.110 } 00:29:07.110 ], 00:29:07.110 "driver_specific": { 00:29:07.110 "raid": { 00:29:07.110 "uuid": "6f42f529-7335-427c-af03-5ab092d134ef", 00:29:07.110 "strip_size_kb": 0, 00:29:07.110 "state": "online", 00:29:07.110 "raid_level": "raid1", 00:29:07.110 "superblock": true, 00:29:07.110 "num_base_bdevs": 2, 00:29:07.110 "num_base_bdevs_discovered": 2, 00:29:07.110 "num_base_bdevs_operational": 2, 00:29:07.110 "base_bdevs_list": [ 00:29:07.110 { 00:29:07.110 "name": "BaseBdev1", 00:29:07.110 "uuid": 
"4ba8422e-101a-4d76-a3a1-a73046195425", 00:29:07.110 "is_configured": true, 00:29:07.110 "data_offset": 256, 00:29:07.110 "data_size": 7936 00:29:07.110 }, 00:29:07.110 { 00:29:07.110 "name": "BaseBdev2", 00:29:07.110 "uuid": "8143e325-45f2-4c10-bf34-b248fa923d3f", 00:29:07.110 "is_configured": true, 00:29:07.110 "data_offset": 256, 00:29:07.110 "data_size": 7936 00:29:07.110 } 00:29:07.110 ] 00:29:07.110 } 00:29:07.110 } 00:29:07.110 }' 00:29:07.110 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:07.110 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:07.110 BaseBdev2' 00:29:07.110 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:07.110 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:07.110 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:07.367 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:07.367 "name": "BaseBdev1", 00:29:07.367 "aliases": [ 00:29:07.367 "4ba8422e-101a-4d76-a3a1-a73046195425" 00:29:07.367 ], 00:29:07.367 "product_name": "Malloc disk", 00:29:07.367 "block_size": 4128, 00:29:07.367 "num_blocks": 8192, 00:29:07.367 "uuid": "4ba8422e-101a-4d76-a3a1-a73046195425", 00:29:07.367 "md_size": 32, 00:29:07.367 "md_interleave": true, 00:29:07.367 "dif_type": 0, 00:29:07.367 "assigned_rate_limits": { 00:29:07.367 "rw_ios_per_sec": 0, 00:29:07.367 "rw_mbytes_per_sec": 0, 00:29:07.367 "r_mbytes_per_sec": 0, 00:29:07.367 "w_mbytes_per_sec": 0 00:29:07.367 }, 00:29:07.367 "claimed": 
true, 00:29:07.367 "claim_type": "exclusive_write", 00:29:07.367 "zoned": false, 00:29:07.367 "supported_io_types": { 00:29:07.367 "read": true, 00:29:07.367 "write": true, 00:29:07.367 "unmap": true, 00:29:07.367 "flush": true, 00:29:07.367 "reset": true, 00:29:07.367 "nvme_admin": false, 00:29:07.367 "nvme_io": false, 00:29:07.367 "nvme_io_md": false, 00:29:07.367 "write_zeroes": true, 00:29:07.367 "zcopy": true, 00:29:07.367 "get_zone_info": false, 00:29:07.367 "zone_management": false, 00:29:07.367 "zone_append": false, 00:29:07.367 "compare": false, 00:29:07.367 "compare_and_write": false, 00:29:07.367 "abort": true, 00:29:07.367 "seek_hole": false, 00:29:07.367 "seek_data": false, 00:29:07.367 "copy": true, 00:29:07.367 "nvme_iov_md": false 00:29:07.367 }, 00:29:07.367 "memory_domains": [ 00:29:07.367 { 00:29:07.367 "dma_device_id": "system", 00:29:07.367 "dma_device_type": 1 00:29:07.367 }, 00:29:07.367 { 00:29:07.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.367 "dma_device_type": 2 00:29:07.367 } 00:29:07.367 ], 00:29:07.367 "driver_specific": {} 00:29:07.367 }' 00:29:07.367 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:07.367 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:07.624 10:36:44 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:07.624 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:07.880 10:36:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:07.880 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:07.880 "name": "BaseBdev2", 00:29:07.880 "aliases": [ 00:29:07.880 "8143e325-45f2-4c10-bf34-b248fa923d3f" 00:29:07.880 ], 00:29:07.880 "product_name": "Malloc disk", 00:29:07.880 "block_size": 4128, 00:29:07.880 "num_blocks": 8192, 00:29:07.880 "uuid": "8143e325-45f2-4c10-bf34-b248fa923d3f", 00:29:07.880 "md_size": 32, 00:29:07.880 "md_interleave": true, 00:29:07.880 "dif_type": 0, 00:29:07.880 "assigned_rate_limits": { 00:29:07.880 "rw_ios_per_sec": 0, 00:29:07.880 "rw_mbytes_per_sec": 0, 00:29:07.880 "r_mbytes_per_sec": 0, 00:29:07.880 "w_mbytes_per_sec": 0 00:29:07.880 }, 00:29:07.880 "claimed": true, 00:29:07.880 "claim_type": "exclusive_write", 00:29:07.880 "zoned": false, 00:29:07.880 "supported_io_types": { 00:29:07.880 "read": true, 00:29:07.880 "write": true, 00:29:07.880 "unmap": true, 00:29:07.880 
"flush": true, 00:29:07.880 "reset": true, 00:29:07.880 "nvme_admin": false, 00:29:07.880 "nvme_io": false, 00:29:07.880 "nvme_io_md": false, 00:29:07.880 "write_zeroes": true, 00:29:07.880 "zcopy": true, 00:29:07.880 "get_zone_info": false, 00:29:07.880 "zone_management": false, 00:29:07.880 "zone_append": false, 00:29:07.880 "compare": false, 00:29:07.880 "compare_and_write": false, 00:29:07.880 "abort": true, 00:29:07.880 "seek_hole": false, 00:29:07.880 "seek_data": false, 00:29:07.880 "copy": true, 00:29:07.880 "nvme_iov_md": false 00:29:07.880 }, 00:29:07.880 "memory_domains": [ 00:29:07.880 { 00:29:07.880 "dma_device_id": "system", 00:29:07.880 "dma_device_type": 1 00:29:07.880 }, 00:29:07.880 { 00:29:07.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:07.880 "dma_device_type": 2 00:29:07.880 } 00:29:07.880 ], 00:29:07.880 "driver_specific": {} 00:29:07.880 }' 00:29:07.880 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:08.137 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:08.137 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:08.137 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:08.137 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:08.137 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:08.137 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:08.137 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:08.394 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:08.394 10:36:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:08.394 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:08.394 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:08.394 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:08.394 [2024-07-15 10:36:45.576183] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:08.652 10:36:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:08.652 "name": "Existed_Raid", 00:29:08.652 "uuid": "6f42f529-7335-427c-af03-5ab092d134ef", 00:29:08.652 "strip_size_kb": 0, 00:29:08.652 "state": "online", 00:29:08.652 "raid_level": "raid1", 00:29:08.652 "superblock": true, 00:29:08.652 "num_base_bdevs": 2, 00:29:08.652 "num_base_bdevs_discovered": 1, 00:29:08.652 "num_base_bdevs_operational": 1, 00:29:08.652 "base_bdevs_list": [ 00:29:08.652 { 00:29:08.652 "name": null, 00:29:08.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:08.652 "is_configured": false, 00:29:08.652 "data_offset": 256, 00:29:08.652 "data_size": 7936 00:29:08.652 }, 00:29:08.652 { 00:29:08.652 "name": "BaseBdev2", 00:29:08.652 "uuid": "8143e325-45f2-4c10-bf34-b248fa923d3f", 00:29:08.652 "is_configured": true, 00:29:08.652 "data_offset": 256, 00:29:08.652 "data_size": 7936 00:29:08.652 } 00:29:08.652 ] 00:29:08.652 }' 00:29:08.652 
10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:08.652 10:36:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:09.216 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:09.216 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:09.216 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:09.216 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.472 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:09.472 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:09.472 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:09.729 [2024-07-15 10:36:46.732349] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:09.729 [2024-07-15 10:36:46.732439] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:09.730 [2024-07-15 10:36:46.745547] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:09.730 [2024-07-15 10:36:46.745582] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:09.730 [2024-07-15 10:36:46.745594] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2554180 name Existed_Raid, state offline 00:29:09.730 10:36:46 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:09.730 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:09.730 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:09.730 10:36:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 629678 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 629678 ']' 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 629678 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 629678 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 629678' 00:29:09.987 killing process with pid 629678 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 629678 00:29:09.987 [2024-07-15 10:36:47.062386] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:09.987 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 629678 00:29:09.987 [2024-07-15 10:36:47.063288] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:10.245 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:29:10.245 00:29:10.245 real 0m9.998s 00:29:10.245 user 0m17.741s 00:29:10.245 sys 0m1.960s 00:29:10.245 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:10.245 10:36:47 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:10.245 ************************************ 00:29:10.245 END TEST raid_state_function_test_sb_md_interleaved 00:29:10.245 ************************************ 00:29:10.245 10:36:47 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:10.245 10:36:47 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:29:10.245 10:36:47 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:29:10.245 10:36:47 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:10.245 10:36:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:10.245 ************************************ 00:29:10.245 START TEST raid_superblock_test_md_interleaved 00:29:10.245 ************************************ 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=631142 00:29:10.245 10:36:47 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 631142 /var/tmp/spdk-raid.sock 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 631142 ']' 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:10.245 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:10.245 10:36:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:10.245 [2024-07-15 10:36:47.438636] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:29:10.245 [2024-07-15 10:36:47.438711] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid631142 ] 00:29:10.502 [2024-07-15 10:36:47.568738] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:10.502 [2024-07-15 10:36:47.672697] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:10.759 [2024-07-15 10:36:47.730508] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:10.759 [2024-07-15 10:36:47.730537] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:29:11.322 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:29:11.579 malloc1 00:29:11.579 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:11.836 [2024-07-15 10:36:48.783255] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:11.836 [2024-07-15 10:36:48.783306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:11.836 [2024-07-15 10:36:48.783327] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17aa4e0 00:29:11.836 [2024-07-15 10:36:48.783340] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:11.836 [2024-07-15 10:36:48.784758] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:11.836 [2024-07-15 10:36:48.784784] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:11.836 pt1 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:11.836 10:36:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:29:12.093 malloc2 00:29:12.093 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:12.093 [2024-07-15 10:36:49.285760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:12.093 [2024-07-15 10:36:49.285812] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:12.093 [2024-07-15 10:36:49.285831] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x178f570 00:29:12.093 [2024-07-15 10:36:49.285844] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:12.093 [2024-07-15 10:36:49.287232] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:12.093 [2024-07-15 10:36:49.287259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:12.093 pt2 00:29:12.350 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:12.350 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:12.350 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:29:12.350 [2024-07-15 10:36:49.534441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:12.350 [2024-07-15 10:36:49.535864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:12.350 [2024-07-15 10:36:49.536029] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1790f20 00:29:12.350 [2024-07-15 10:36:49.536043] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:12.350 [2024-07-15 10:36:49.536116] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x160d050 00:29:12.350 [2024-07-15 10:36:49.536201] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1790f20 00:29:12.350 [2024-07-15 10:36:49.536211] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1790f20 00:29:12.350 [2024-07-15 10:36:49.536271] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:12.606 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:12.863 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:12.863 "name": "raid_bdev1", 00:29:12.863 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:12.863 "strip_size_kb": 0, 00:29:12.863 "state": "online", 00:29:12.863 "raid_level": "raid1", 00:29:12.863 "superblock": true, 00:29:12.863 "num_base_bdevs": 2, 00:29:12.863 "num_base_bdevs_discovered": 2, 00:29:12.863 "num_base_bdevs_operational": 2, 00:29:12.863 "base_bdevs_list": [ 00:29:12.863 { 00:29:12.863 "name": "pt1", 00:29:12.863 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:12.863 "is_configured": true, 00:29:12.863 "data_offset": 256, 00:29:12.863 "data_size": 7936 00:29:12.863 }, 00:29:12.863 { 00:29:12.863 "name": "pt2", 00:29:12.863 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:12.863 "is_configured": true, 00:29:12.863 "data_offset": 256, 00:29:12.863 "data_size": 7936 00:29:12.863 } 00:29:12.863 ] 00:29:12.863 }' 00:29:12.863 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:12.863 10:36:49 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:13.427 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:13.427 10:36:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:13.427 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:13.427 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:13.427 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:13.427 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:13.427 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:13.427 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:13.427 [2024-07-15 10:36:50.613569] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:13.685 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:13.685 "name": "raid_bdev1", 00:29:13.685 "aliases": [ 00:29:13.685 "31e0550b-ab24-4778-a50c-92e51bb9c75d" 00:29:13.685 ], 00:29:13.685 "product_name": "Raid Volume", 00:29:13.685 "block_size": 4128, 00:29:13.685 "num_blocks": 7936, 00:29:13.685 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:13.685 "md_size": 32, 00:29:13.685 "md_interleave": true, 00:29:13.685 "dif_type": 0, 00:29:13.685 "assigned_rate_limits": { 00:29:13.685 "rw_ios_per_sec": 0, 00:29:13.685 "rw_mbytes_per_sec": 0, 00:29:13.685 "r_mbytes_per_sec": 0, 00:29:13.685 "w_mbytes_per_sec": 0 00:29:13.685 }, 00:29:13.685 "claimed": false, 00:29:13.685 "zoned": false, 00:29:13.685 "supported_io_types": { 00:29:13.685 "read": true, 00:29:13.685 "write": true, 00:29:13.685 "unmap": false, 00:29:13.685 "flush": false, 00:29:13.685 "reset": true, 00:29:13.685 "nvme_admin": false, 
00:29:13.685 "nvme_io": false, 00:29:13.685 "nvme_io_md": false, 00:29:13.685 "write_zeroes": true, 00:29:13.685 "zcopy": false, 00:29:13.685 "get_zone_info": false, 00:29:13.685 "zone_management": false, 00:29:13.685 "zone_append": false, 00:29:13.685 "compare": false, 00:29:13.685 "compare_and_write": false, 00:29:13.685 "abort": false, 00:29:13.685 "seek_hole": false, 00:29:13.685 "seek_data": false, 00:29:13.685 "copy": false, 00:29:13.685 "nvme_iov_md": false 00:29:13.685 }, 00:29:13.685 "memory_domains": [ 00:29:13.685 { 00:29:13.685 "dma_device_id": "system", 00:29:13.685 "dma_device_type": 1 00:29:13.685 }, 00:29:13.685 { 00:29:13.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:13.685 "dma_device_type": 2 00:29:13.685 }, 00:29:13.685 { 00:29:13.685 "dma_device_id": "system", 00:29:13.685 "dma_device_type": 1 00:29:13.685 }, 00:29:13.685 { 00:29:13.685 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:13.685 "dma_device_type": 2 00:29:13.685 } 00:29:13.685 ], 00:29:13.685 "driver_specific": { 00:29:13.685 "raid": { 00:29:13.685 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:13.685 "strip_size_kb": 0, 00:29:13.685 "state": "online", 00:29:13.685 "raid_level": "raid1", 00:29:13.685 "superblock": true, 00:29:13.685 "num_base_bdevs": 2, 00:29:13.685 "num_base_bdevs_discovered": 2, 00:29:13.685 "num_base_bdevs_operational": 2, 00:29:13.685 "base_bdevs_list": [ 00:29:13.685 { 00:29:13.685 "name": "pt1", 00:29:13.685 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:13.685 "is_configured": true, 00:29:13.685 "data_offset": 256, 00:29:13.685 "data_size": 7936 00:29:13.685 }, 00:29:13.685 { 00:29:13.685 "name": "pt2", 00:29:13.685 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:13.685 "is_configured": true, 00:29:13.685 "data_offset": 256, 00:29:13.685 "data_size": 7936 00:29:13.685 } 00:29:13.685 ] 00:29:13.685 } 00:29:13.685 } 00:29:13.685 }' 00:29:13.685 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:13.685 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:13.685 pt2' 00:29:13.685 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:13.685 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:13.685 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:13.942 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:13.942 "name": "pt1", 00:29:13.942 "aliases": [ 00:29:13.942 "00000000-0000-0000-0000-000000000001" 00:29:13.942 ], 00:29:13.942 "product_name": "passthru", 00:29:13.942 "block_size": 4128, 00:29:13.942 "num_blocks": 8192, 00:29:13.943 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:13.943 "md_size": 32, 00:29:13.943 "md_interleave": true, 00:29:13.943 "dif_type": 0, 00:29:13.943 "assigned_rate_limits": { 00:29:13.943 "rw_ios_per_sec": 0, 00:29:13.943 "rw_mbytes_per_sec": 0, 00:29:13.943 "r_mbytes_per_sec": 0, 00:29:13.943 "w_mbytes_per_sec": 0 00:29:13.943 }, 00:29:13.943 "claimed": true, 00:29:13.943 "claim_type": "exclusive_write", 00:29:13.943 "zoned": false, 00:29:13.943 "supported_io_types": { 00:29:13.943 "read": true, 00:29:13.943 "write": true, 00:29:13.943 "unmap": true, 00:29:13.943 "flush": true, 00:29:13.943 "reset": true, 00:29:13.943 "nvme_admin": false, 00:29:13.943 "nvme_io": false, 00:29:13.943 "nvme_io_md": false, 00:29:13.943 "write_zeroes": true, 00:29:13.943 "zcopy": true, 00:29:13.943 "get_zone_info": false, 00:29:13.943 "zone_management": false, 00:29:13.943 "zone_append": false, 00:29:13.943 "compare": false, 00:29:13.943 "compare_and_write": false, 00:29:13.943 
"abort": true, 00:29:13.943 "seek_hole": false, 00:29:13.943 "seek_data": false, 00:29:13.943 "copy": true, 00:29:13.943 "nvme_iov_md": false 00:29:13.943 }, 00:29:13.943 "memory_domains": [ 00:29:13.943 { 00:29:13.943 "dma_device_id": "system", 00:29:13.943 "dma_device_type": 1 00:29:13.943 }, 00:29:13.943 { 00:29:13.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:13.943 "dma_device_type": 2 00:29:13.943 } 00:29:13.943 ], 00:29:13.943 "driver_specific": { 00:29:13.943 "passthru": { 00:29:13.943 "name": "pt1", 00:29:13.943 "base_bdev_name": "malloc1" 00:29:13.943 } 00:29:13.943 } 00:29:13.943 }' 00:29:13.943 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.943 10:36:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:13.943 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:13.943 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:13.943 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:13.943 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:13.943 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:14.200 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:14.200 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:14.200 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.200 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.200 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:14.200 10:36:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:14.200 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:14.200 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:14.458 "name": "pt2", 00:29:14.458 "aliases": [ 00:29:14.458 "00000000-0000-0000-0000-000000000002" 00:29:14.458 ], 00:29:14.458 "product_name": "passthru", 00:29:14.458 "block_size": 4128, 00:29:14.458 "num_blocks": 8192, 00:29:14.458 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:14.458 "md_size": 32, 00:29:14.458 "md_interleave": true, 00:29:14.458 "dif_type": 0, 00:29:14.458 "assigned_rate_limits": { 00:29:14.458 "rw_ios_per_sec": 0, 00:29:14.458 "rw_mbytes_per_sec": 0, 00:29:14.458 "r_mbytes_per_sec": 0, 00:29:14.458 "w_mbytes_per_sec": 0 00:29:14.458 }, 00:29:14.458 "claimed": true, 00:29:14.458 "claim_type": "exclusive_write", 00:29:14.458 "zoned": false, 00:29:14.458 "supported_io_types": { 00:29:14.458 "read": true, 00:29:14.458 "write": true, 00:29:14.458 "unmap": true, 00:29:14.458 "flush": true, 00:29:14.458 "reset": true, 00:29:14.458 "nvme_admin": false, 00:29:14.458 "nvme_io": false, 00:29:14.458 "nvme_io_md": false, 00:29:14.458 "write_zeroes": true, 00:29:14.458 "zcopy": true, 00:29:14.458 "get_zone_info": false, 00:29:14.458 "zone_management": false, 00:29:14.458 "zone_append": false, 00:29:14.458 "compare": false, 00:29:14.458 "compare_and_write": false, 00:29:14.458 "abort": true, 00:29:14.458 "seek_hole": false, 00:29:14.458 "seek_data": false, 00:29:14.458 "copy": true, 00:29:14.458 "nvme_iov_md": false 00:29:14.458 }, 00:29:14.458 "memory_domains": [ 00:29:14.458 { 00:29:14.458 "dma_device_id": 
"system", 00:29:14.458 "dma_device_type": 1 00:29:14.458 }, 00:29:14.458 { 00:29:14.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:14.458 "dma_device_type": 2 00:29:14.458 } 00:29:14.458 ], 00:29:14.458 "driver_specific": { 00:29:14.458 "passthru": { 00:29:14.458 "name": "pt2", 00:29:14.458 "base_bdev_name": "malloc2" 00:29:14.458 } 00:29:14.458 } 00:29:14.458 }' 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:14.458 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:14.715 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:14.715 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:14.715 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.715 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:14.715 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:14.715 10:36:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:14.715 10:36:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:14.973 [2024-07-15 10:36:52.017302] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:14.973 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=31e0550b-ab24-4778-a50c-92e51bb9c75d 00:29:14.973 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 31e0550b-ab24-4778-a50c-92e51bb9c75d ']' 00:29:14.973 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:15.230 [2024-07-15 10:36:52.261653] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:15.230 [2024-07-15 10:36:52.261679] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:15.230 [2024-07-15 10:36:52.261743] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:15.230 [2024-07-15 10:36:52.261799] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:15.230 [2024-07-15 10:36:52.261812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1790f20 name raid_bdev1, state offline 00:29:15.230 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:15.230 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:15.487 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:15.487 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:15.487 10:36:52 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:15.488 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:15.745 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:15.745 10:36:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:16.002 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:16.002 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:16.258 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:16.258 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:16.258 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:16.258 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.259 10:36:53 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:16.259 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:16.515 [2024-07-15 10:36:53.484839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:16.515 [2024-07-15 10:36:53.486248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:16.515 [2024-07-15 10:36:53.486305] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:16.515 [2024-07-15 10:36:53.486346] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:16.515 [2024-07-15 10:36:53.486365] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:16.515 [2024-07-15 10:36:53.486376] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x179b260 name raid_bdev1, state configuring 00:29:16.515 request: 00:29:16.516 { 00:29:16.516 "name": "raid_bdev1", 00:29:16.516 "raid_level": "raid1", 00:29:16.516 "base_bdevs": [ 00:29:16.516 "malloc1", 00:29:16.516 "malloc2" 00:29:16.516 ], 00:29:16.516 "superblock": false, 00:29:16.516 "method": "bdev_raid_create", 00:29:16.516 "req_id": 1 00:29:16.516 } 00:29:16.516 Got JSON-RPC error response 00:29:16.516 response: 00:29:16.516 { 00:29:16.516 "code": -17, 00:29:16.516 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:16.516 } 00:29:16.516 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:16.516 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:16.516 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:16.516 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:16.516 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.516 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:16.772 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:16.772 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:16.772 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:29:16.772 [2024-07-15 10:36:53.970068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:16.772 [2024-07-15 10:36:53.970123] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:16.772 [2024-07-15 10:36:53.970144] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1792000 00:29:16.772 [2024-07-15 10:36:53.970156] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.030 [2024-07-15 10:36:53.971627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:17.030 [2024-07-15 10:36:53.971656] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:17.030 [2024-07-15 10:36:53.971706] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:17.030 [2024-07-15 10:36:53.971735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:17.030 pt1 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.030 10:36:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.287 10:36:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.287 "name": "raid_bdev1", 00:29:17.287 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:17.287 "strip_size_kb": 0, 00:29:17.287 "state": "configuring", 00:29:17.287 "raid_level": "raid1", 00:29:17.287 "superblock": true, 00:29:17.287 "num_base_bdevs": 2, 00:29:17.287 "num_base_bdevs_discovered": 1, 00:29:17.287 "num_base_bdevs_operational": 2, 00:29:17.287 "base_bdevs_list": [ 00:29:17.287 { 00:29:17.287 "name": "pt1", 00:29:17.287 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:17.287 "is_configured": true, 00:29:17.287 "data_offset": 256, 00:29:17.287 "data_size": 7936 00:29:17.287 }, 00:29:17.287 { 00:29:17.287 "name": null, 00:29:17.287 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:17.287 "is_configured": false, 00:29:17.287 "data_offset": 256, 00:29:17.287 "data_size": 7936 00:29:17.287 } 00:29:17.287 ] 00:29:17.287 }' 00:29:17.287 10:36:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.287 10:36:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:17.883 10:36:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:17.883 10:36:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:17.883 10:36:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:17.883 10:36:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:17.883 [2024-07-15 10:36:55.048946] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:17.883 [2024-07-15 10:36:55.049005] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:17.883 [2024-07-15 10:36:55.049030] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1794270 00:29:17.883 [2024-07-15 10:36:55.049044] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:17.883 [2024-07-15 10:36:55.049231] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:17.883 [2024-07-15 10:36:55.049247] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:17.883 [2024-07-15 10:36:55.049297] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:17.883 [2024-07-15 10:36:55.049316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:17.883 [2024-07-15 10:36:55.049398] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x160dc10 00:29:17.883 [2024-07-15 10:36:55.049408] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:17.883 [2024-07-15 10:36:55.049464] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x178fd40 00:29:17.883 [2024-07-15 10:36:55.049542] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x160dc10 00:29:17.883 [2024-07-15 10:36:55.049552] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x160dc10 00:29:17.883 [2024-07-15 10:36:55.049610] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:18.190 pt2 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:18.190 10:36:55 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:18.190 "name": "raid_bdev1", 00:29:18.190 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:18.190 "strip_size_kb": 0, 00:29:18.190 "state": "online", 00:29:18.190 "raid_level": "raid1", 00:29:18.190 "superblock": true, 00:29:18.190 "num_base_bdevs": 2, 00:29:18.190 "num_base_bdevs_discovered": 2, 00:29:18.190 "num_base_bdevs_operational": 2, 00:29:18.190 "base_bdevs_list": [ 00:29:18.190 { 00:29:18.190 "name": "pt1", 00:29:18.190 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:18.190 "is_configured": true, 00:29:18.190 "data_offset": 256, 00:29:18.190 "data_size": 7936 00:29:18.190 }, 00:29:18.190 { 00:29:18.190 "name": "pt2", 00:29:18.190 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:18.190 "is_configured": true, 00:29:18.190 "data_offset": 256, 00:29:18.190 "data_size": 7936 00:29:18.190 } 00:29:18.190 ] 00:29:18.190 }' 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:18.190 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:18.754 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:18.754 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:18.754 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:18.754 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:18.754 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:18.754 10:36:55 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:29:18.754 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:18.754 10:36:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:19.012 [2024-07-15 10:36:56.156151] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:19.012 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:19.012 "name": "raid_bdev1", 00:29:19.012 "aliases": [ 00:29:19.012 "31e0550b-ab24-4778-a50c-92e51bb9c75d" 00:29:19.012 ], 00:29:19.012 "product_name": "Raid Volume", 00:29:19.012 "block_size": 4128, 00:29:19.012 "num_blocks": 7936, 00:29:19.012 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:19.012 "md_size": 32, 00:29:19.012 "md_interleave": true, 00:29:19.012 "dif_type": 0, 00:29:19.012 "assigned_rate_limits": { 00:29:19.012 "rw_ios_per_sec": 0, 00:29:19.012 "rw_mbytes_per_sec": 0, 00:29:19.012 "r_mbytes_per_sec": 0, 00:29:19.012 "w_mbytes_per_sec": 0 00:29:19.012 }, 00:29:19.012 "claimed": false, 00:29:19.012 "zoned": false, 00:29:19.012 "supported_io_types": { 00:29:19.012 "read": true, 00:29:19.012 "write": true, 00:29:19.012 "unmap": false, 00:29:19.012 "flush": false, 00:29:19.012 "reset": true, 00:29:19.012 "nvme_admin": false, 00:29:19.012 "nvme_io": false, 00:29:19.012 "nvme_io_md": false, 00:29:19.012 "write_zeroes": true, 00:29:19.012 "zcopy": false, 00:29:19.012 "get_zone_info": false, 00:29:19.012 "zone_management": false, 00:29:19.012 "zone_append": false, 00:29:19.012 "compare": false, 00:29:19.012 "compare_and_write": false, 00:29:19.012 "abort": false, 00:29:19.012 "seek_hole": false, 00:29:19.012 "seek_data": false, 00:29:19.012 "copy": false, 00:29:19.012 "nvme_iov_md": false 00:29:19.012 }, 
00:29:19.012 "memory_domains": [ 00:29:19.012 { 00:29:19.012 "dma_device_id": "system", 00:29:19.012 "dma_device_type": 1 00:29:19.012 }, 00:29:19.012 { 00:29:19.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:19.012 "dma_device_type": 2 00:29:19.012 }, 00:29:19.012 { 00:29:19.012 "dma_device_id": "system", 00:29:19.012 "dma_device_type": 1 00:29:19.012 }, 00:29:19.012 { 00:29:19.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:19.012 "dma_device_type": 2 00:29:19.012 } 00:29:19.012 ], 00:29:19.012 "driver_specific": { 00:29:19.012 "raid": { 00:29:19.012 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:19.012 "strip_size_kb": 0, 00:29:19.012 "state": "online", 00:29:19.012 "raid_level": "raid1", 00:29:19.012 "superblock": true, 00:29:19.012 "num_base_bdevs": 2, 00:29:19.012 "num_base_bdevs_discovered": 2, 00:29:19.012 "num_base_bdevs_operational": 2, 00:29:19.012 "base_bdevs_list": [ 00:29:19.012 { 00:29:19.012 "name": "pt1", 00:29:19.012 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:19.012 "is_configured": true, 00:29:19.012 "data_offset": 256, 00:29:19.012 "data_size": 7936 00:29:19.012 }, 00:29:19.012 { 00:29:19.012 "name": "pt2", 00:29:19.012 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:19.012 "is_configured": true, 00:29:19.012 "data_offset": 256, 00:29:19.012 "data_size": 7936 00:29:19.012 } 00:29:19.012 ] 00:29:19.012 } 00:29:19.012 } 00:29:19.012 }' 00:29:19.012 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:19.269 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:19.269 pt2' 00:29:19.269 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:19.269 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:19.269 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:19.269 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:19.269 "name": "pt1", 00:29:19.269 "aliases": [ 00:29:19.269 "00000000-0000-0000-0000-000000000001" 00:29:19.269 ], 00:29:19.269 "product_name": "passthru", 00:29:19.269 "block_size": 4128, 00:29:19.269 "num_blocks": 8192, 00:29:19.269 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:19.269 "md_size": 32, 00:29:19.269 "md_interleave": true, 00:29:19.269 "dif_type": 0, 00:29:19.269 "assigned_rate_limits": { 00:29:19.269 "rw_ios_per_sec": 0, 00:29:19.269 "rw_mbytes_per_sec": 0, 00:29:19.269 "r_mbytes_per_sec": 0, 00:29:19.269 "w_mbytes_per_sec": 0 00:29:19.269 }, 00:29:19.269 "claimed": true, 00:29:19.269 "claim_type": "exclusive_write", 00:29:19.269 "zoned": false, 00:29:19.269 "supported_io_types": { 00:29:19.269 "read": true, 00:29:19.269 "write": true, 00:29:19.269 "unmap": true, 00:29:19.269 "flush": true, 00:29:19.269 "reset": true, 00:29:19.269 "nvme_admin": false, 00:29:19.269 "nvme_io": false, 00:29:19.269 "nvme_io_md": false, 00:29:19.269 "write_zeroes": true, 00:29:19.269 "zcopy": true, 00:29:19.269 "get_zone_info": false, 00:29:19.269 "zone_management": false, 00:29:19.269 "zone_append": false, 00:29:19.269 "compare": false, 00:29:19.269 "compare_and_write": false, 00:29:19.269 "abort": true, 00:29:19.269 "seek_hole": false, 00:29:19.269 "seek_data": false, 00:29:19.269 "copy": true, 00:29:19.269 "nvme_iov_md": false 00:29:19.269 }, 00:29:19.269 "memory_domains": [ 00:29:19.269 { 00:29:19.269 "dma_device_id": "system", 00:29:19.269 "dma_device_type": 1 00:29:19.269 }, 00:29:19.270 { 00:29:19.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:19.270 "dma_device_type": 2 00:29:19.270 } 00:29:19.270 ], 00:29:19.270 
"driver_specific": { 00:29:19.270 "passthru": { 00:29:19.270 "name": "pt1", 00:29:19.270 "base_bdev_name": "malloc1" 00:29:19.270 } 00:29:19.270 } 00:29:19.270 }' 00:29:19.270 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:19.527 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:19.527 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:19.527 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:19.527 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:19.527 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:19.527 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.527 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:19.784 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:19.784 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.784 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:19.784 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:19.784 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:19.784 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:19.784 10:36:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:20.041 10:36:57 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:20.041 "name": "pt2", 00:29:20.041 "aliases": [ 00:29:20.041 "00000000-0000-0000-0000-000000000002" 00:29:20.041 ], 00:29:20.041 "product_name": "passthru", 00:29:20.041 "block_size": 4128, 00:29:20.041 "num_blocks": 8192, 00:29:20.041 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:20.041 "md_size": 32, 00:29:20.041 "md_interleave": true, 00:29:20.041 "dif_type": 0, 00:29:20.041 "assigned_rate_limits": { 00:29:20.041 "rw_ios_per_sec": 0, 00:29:20.041 "rw_mbytes_per_sec": 0, 00:29:20.041 "r_mbytes_per_sec": 0, 00:29:20.041 "w_mbytes_per_sec": 0 00:29:20.041 }, 00:29:20.041 "claimed": true, 00:29:20.041 "claim_type": "exclusive_write", 00:29:20.041 "zoned": false, 00:29:20.041 "supported_io_types": { 00:29:20.041 "read": true, 00:29:20.041 "write": true, 00:29:20.041 "unmap": true, 00:29:20.041 "flush": true, 00:29:20.041 "reset": true, 00:29:20.041 "nvme_admin": false, 00:29:20.041 "nvme_io": false, 00:29:20.041 "nvme_io_md": false, 00:29:20.041 "write_zeroes": true, 00:29:20.041 "zcopy": true, 00:29:20.041 "get_zone_info": false, 00:29:20.041 "zone_management": false, 00:29:20.041 "zone_append": false, 00:29:20.041 "compare": false, 00:29:20.041 "compare_and_write": false, 00:29:20.041 "abort": true, 00:29:20.041 "seek_hole": false, 00:29:20.041 "seek_data": false, 00:29:20.041 "copy": true, 00:29:20.041 "nvme_iov_md": false 00:29:20.041 }, 00:29:20.041 "memory_domains": [ 00:29:20.041 { 00:29:20.041 "dma_device_id": "system", 00:29:20.041 "dma_device_type": 1 00:29:20.041 }, 00:29:20.041 { 00:29:20.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:20.041 "dma_device_type": 2 00:29:20.041 } 00:29:20.041 ], 00:29:20.041 "driver_specific": { 00:29:20.041 "passthru": { 00:29:20.041 "name": "pt2", 00:29:20.041 "base_bdev_name": "malloc2" 00:29:20.041 } 00:29:20.041 } 00:29:20.041 }' 00:29:20.041 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:20.041 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:20.041 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:29:20.041 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:20.041 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:20.299 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:20.556 [2024-07-15 10:36:57.664153] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:20.556 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 31e0550b-ab24-4778-a50c-92e51bb9c75d '!=' 31e0550b-ab24-4778-a50c-92e51bb9c75d ']' 00:29:20.556 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:20.556 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:20.556 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:29:20.556 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:20.814 [2024-07-15 10:36:57.908530] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.814 10:36:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:21.071 10:36:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:21.071 "name": "raid_bdev1", 00:29:21.071 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d", 00:29:21.071 "strip_size_kb": 0, 00:29:21.071 "state": "online", 00:29:21.071 "raid_level": "raid1", 00:29:21.071 "superblock": true, 00:29:21.071 "num_base_bdevs": 2, 00:29:21.071 "num_base_bdevs_discovered": 1, 00:29:21.071 "num_base_bdevs_operational": 1, 00:29:21.071 "base_bdevs_list": [ 00:29:21.071 { 00:29:21.071 "name": null, 00:29:21.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:21.071 "is_configured": false, 00:29:21.071 "data_offset": 256, 00:29:21.071 "data_size": 7936 00:29:21.071 }, 00:29:21.071 { 00:29:21.071 "name": "pt2", 00:29:21.071 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:21.071 "is_configured": true, 00:29:21.071 "data_offset": 256, 00:29:21.071 "data_size": 7936 00:29:21.071 } 00:29:21.071 ] 00:29:21.071 }' 00:29:21.071 10:36:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:21.071 10:36:58 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:21.636 10:36:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:21.893 [2024-07-15 10:36:58.995372] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:21.893 [2024-07-15 10:36:58.995403] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:21.893 [2024-07-15 10:36:58.995468] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:21.893 [2024-07-15 
10:36:58.995515] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:21.893 [2024-07-15 10:36:58.995527] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x160dc10 name raid_bdev1, state offline 00:29:21.893 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:21.893 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:22.150 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:29:22.150 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:22.150 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:22.150 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:22.150 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:22.407 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:22.407 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:22.407 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:22.407 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:22.407 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:29:22.407 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:29:22.665 [2024-07-15 10:36:59.737303] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:29:22.665 [2024-07-15 10:36:59.737358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:22.665 [2024-07-15 10:36:59.737377] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17929f0
00:29:22.665 [2024-07-15 10:36:59.737390] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:22.665 [2024-07-15 10:36:59.738822] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:22.665 [2024-07-15 10:36:59.738848] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:29:22.665 [2024-07-15 10:36:59.738896] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:29:22.665 [2024-07-15 10:36:59.738935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:29:22.665 [2024-07-15 10:36:59.739006] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1793ea0
00:29:22.665 [2024-07-15 10:36:59.739016] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128
00:29:22.665 [2024-07-15 10:36:59.739075] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1791bc0
00:29:22.665 [2024-07-15 10:36:59.739149] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1793ea0
00:29:22.665 [2024-07-15 10:36:59.739159] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1793ea0
00:29:22.665 [2024-07-15 10:36:59.739215] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:29:22.665 pt2
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:22.665 10:36:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:22.923 10:37:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:22.923 "name": "raid_bdev1",
00:29:22.923 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d",
00:29:22.923 "strip_size_kb": 0,
00:29:22.923 "state": "online",
00:29:22.923 "raid_level": "raid1",
00:29:22.923 "superblock": true,
00:29:22.923 "num_base_bdevs": 2,
00:29:22.923 "num_base_bdevs_discovered": 1,
00:29:22.923 "num_base_bdevs_operational": 1,
00:29:22.923 "base_bdevs_list": [
00:29:22.923 {
00:29:22.923 "name": null,
00:29:22.923 "uuid": "00000000-0000-0000-0000-000000000000",
00:29:22.923 "is_configured": false,
00:29:22.923 "data_offset": 256,
00:29:22.923 "data_size": 7936
00:29:22.923 },
00:29:22.923 {
00:29:22.923 "name": "pt2",
00:29:22.923 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:22.923 "is_configured": true,
00:29:22.923 "data_offset": 256,
00:29:22.923 "data_size": 7936
00:29:22.923 }
00:29:22.923 ]
00:29:22.923 }'
00:29:22.923 10:37:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:22.923 10:37:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:29:23.487 10:37:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:29:23.745 [2024-07-15 10:37:00.756015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:29:23.745 [2024-07-15 10:37:00.756044] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:29:23.745 [2024-07-15 10:37:00.756105] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:29:23.745 [2024-07-15 10:37:00.756151] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:29:23.745 [2024-07-15 10:37:00.756163] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1793ea0 name raid_bdev1, state offline
00:29:23.745 10:37:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:23.745 10:37:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]'
00:29:24.002 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev=
00:29:24.002 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']'
00:29:24.002 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']'
00:29:24.002 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:29:24.259 [2024-07-15 10:37:01.249298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:29:24.259 [2024-07-15 10:37:01.249347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:24.259 [2024-07-15 10:37:01.249368] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1792620
00:29:24.260 [2024-07-15 10:37:01.249380] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:24.260 [2024-07-15 10:37:01.250809] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:24.260 [2024-07-15 10:37:01.250834] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:29:24.260 [2024-07-15 10:37:01.250878] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:29:24.260 [2024-07-15 10:37:01.250903] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:29:24.260 [2024-07-15 10:37:01.250989] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2)
00:29:24.260 [2024-07-15 10:37:01.251003] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:29:24.260 [2024-07-15 10:37:01.251018] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1794640 name raid_bdev1, state configuring
00:29:24.260 [2024-07-15 10:37:01.251040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:29:24.260 [2024-07-15 10:37:01.251094] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1794640
00:29:24.260 [2024-07-15 10:37:01.251104] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128
00:29:24.260 [2024-07-15 10:37:01.251154] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1793810
00:29:24.260 [2024-07-15 10:37:01.251225] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1794640
00:29:24.260 [2024-07-15 10:37:01.251235] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1794640
00:29:24.260 [2024-07-15 10:37:01.251293] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:29:24.260 pt1
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']'
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:24.260 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:24.517 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:24.517 "name": "raid_bdev1",
00:29:24.517 "uuid": "31e0550b-ab24-4778-a50c-92e51bb9c75d",
00:29:24.517 "strip_size_kb": 0,
00:29:24.517 "state": "online",
00:29:24.517 "raid_level": "raid1",
00:29:24.517 "superblock": true,
00:29:24.517 "num_base_bdevs": 2,
00:29:24.517 "num_base_bdevs_discovered": 1,
00:29:24.517 "num_base_bdevs_operational": 1,
00:29:24.517 "base_bdevs_list": [
00:29:24.517 {
00:29:24.517 "name": null,
00:29:24.517 "uuid": "00000000-0000-0000-0000-000000000000",
00:29:24.517 "is_configured": false,
00:29:24.517 "data_offset": 256,
00:29:24.517 "data_size": 7936
00:29:24.517 },
00:29:24.517 {
00:29:24.517 "name": "pt2",
00:29:24.517 "uuid": "00000000-0000-0000-0000-000000000002",
00:29:24.517 "is_configured": true,
00:29:24.517 "data_offset": 256,
00:29:24.517 "data_size": 7936
00:29:24.517 }
00:29:24.517 ]
00:29:24.517 }'
00:29:24.517 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:24.517 10:37:01 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:29:25.078 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured'
00:29:25.078 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online
00:29:25.334 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]]
00:29:25.334 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:29:25.334 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid'
00:29:25.590 [2024-07-15 10:37:02.532933] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 31e0550b-ab24-4778-a50c-92e51bb9c75d '!=' 31e0550b-ab24-4778-a50c-92e51bb9c75d ']'
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 631142
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 631142 ']'
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 631142
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 631142
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 631142'
killing process with pid 631142
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 631142
[2024-07-15 10:37:02.605760] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:29:25.590 [2024-07-15 10:37:02.605824] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:29:25.590 [2024-07-15 10:37:02.605874] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:29:25.590 [2024-07-15 10:37:02.605887] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1794640 name raid_bdev1, state offline
00:29:25.590 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 631142
[2024-07-15 10:37:02.624102] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:29:25.846 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0
00:29:25.846
00:29:25.846 real 0m15.473s
00:29:25.846 user 0m28.034s
00:29:25.846 sys 0m2.876s
00:29:25.846 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable
00:29:25.846 10:37:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:29:25.846 ************************************
00:29:25.846 END TEST raid_superblock_test_md_interleaved
00:29:25.846 ************************************
00:29:25.846 10:37:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:29:25.846 10:37:02 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false
00:29:25.846 10:37:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']'
00:29:25.846 10:37:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:29:25.846 10:37:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:29:25.846 ************************************
00:29:25.846 START TEST raid_rebuild_test_sb_md_interleaved
00:29:25.846 ************************************
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 ))
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:29:25.846 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ ))
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs ))
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']'
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']'
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s'
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=633402
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 633402 /var/tmp/spdk-raid.sock
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 633402 ']'
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable
00:29:25.847 10:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
[2024-07-15 10:37:03.003363] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
[2024-07-15 10:37:03.003430] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid633402 ]
I/O size of 3145728 is greater than zero copy threshold (65536).
Zero copy mechanism will not be used.
00:29:26.103 [2024-07-15 10:37:03.132319] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:26.103 [2024-07-15 10:37:03.240248] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:29:26.359 [2024-07-15 10:37:03.309089] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:29:26.359 [2024-07-15 10:37:03.309124] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:29:26.923 10:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:29:26.923 10:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0
00:29:26.923 10:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:29:26.923 10:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc
00:29:26.923 BaseBdev1_malloc
00:29:27.179 10:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
00:29:27.179 [2024-07-15 10:37:04.354854] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc
00:29:27.179 [2024-07-15 10:37:04.354909] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:27.179 [2024-07-15 10:37:04.354943] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116fce0
00:29:27.179 [2024-07-15 10:37:04.354957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:27.179 [2024-07-15 10:37:04.356587] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:27.179 [2024-07-15 10:37:04.356616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:29:27.179 BaseBdev1
00:29:27.179 10:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:29:27.179 10:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc
00:29:27.436 BaseBdev2_malloc
00:29:27.692 10:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
00:29:27.692 [2024-07-15 10:37:04.865437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc
00:29:27.692 [2024-07-15 10:37:04.865490] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:27.692 [2024-07-15 10:37:04.865521] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11672d0
00:29:27.692 [2024-07-15 10:37:04.865534] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:27.692 [2024-07-15 10:37:04.867341] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:27.692 [2024-07-15 10:37:04.867370] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:29:27.692 BaseBdev2
00:29:27.692 10:37:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc
00:29:27.949 spare_malloc
00:29:27.949 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
00:29:28.206 spare_delay
00:29:28.206 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:29:28.463 [2024-07-15 10:37:05.557439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:29:28.463 [2024-07-15 10:37:05.557489] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:29:28.463 [2024-07-15 10:37:05.557512] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116a070
00:29:28.463 [2024-07-15 10:37:05.557525] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:29:28.463 [2024-07-15 10:37:05.558941] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:29:28.463 [2024-07-15 10:37:05.558968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:29:28.463 spare
00:29:28.463 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
00:29:28.720 [2024-07-15 10:37:05.790093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:29:28.721 [2024-07-15 10:37:05.791442] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:29:28.721 [2024-07-15 10:37:05.791611] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x116c370
00:29:28.721 [2024-07-15 10:37:05.791624] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128
00:29:28.721 [2024-07-15 10:37:05.791703] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfd29c0
00:29:28.721 [2024-07-15 10:37:05.791791] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116c370
00:29:28.721 [2024-07-15 10:37:05.791801] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116c370
00:29:28.721 [2024-07-15 10:37:05.791862] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:28.721 10:37:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:28.978 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:28.978 "name": "raid_bdev1",
00:29:28.978 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159",
00:29:28.978 "strip_size_kb": 0,
00:29:28.978 "state": "online",
00:29:28.978 "raid_level": "raid1",
00:29:28.978 "superblock": true,
00:29:28.978 "num_base_bdevs": 2,
00:29:28.978 "num_base_bdevs_discovered": 2,
00:29:28.978 "num_base_bdevs_operational": 2,
00:29:28.978 "base_bdevs_list": [
00:29:28.978 {
00:29:28.978 "name": "BaseBdev1",
00:29:28.978 "uuid": "fe21b550-8bbf-5f95-bb86-13a7bc4f563a",
00:29:28.978 "is_configured": true,
00:29:28.978 "data_offset": 256,
00:29:28.978 "data_size": 7936
00:29:28.978 },
00:29:28.978 {
00:29:28.978 "name": "BaseBdev2",
00:29:28.978 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b",
00:29:28.978 "is_configured": true,
00:29:28.978 "data_offset": 256,
00:29:28.978 "data_size": 7936
00:29:28.978 }
00:29:28.978 ]
00:29:28.978 }'
00:29:28.978 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:28.978 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:29:29.542 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:29:29.542 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks'
00:29:29.799 [2024-07-15 10:37:06.885240] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:29:29.799 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936
00:29:29.799 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:29.799 10:37:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:29:30.056 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256
00:29:30.056 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']'
00:29:30.056 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']'
00:29:30.056 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:29:30.314 [2024-07-15 10:37:07.382289] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:30.314 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:30.572 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:29:30.572 "name": "raid_bdev1",
00:29:30.572 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159",
00:29:30.573 "strip_size_kb": 0,
00:29:30.573 "state": "online",
00:29:30.573 "raid_level": "raid1",
00:29:30.573 "superblock": true,
00:29:30.573 "num_base_bdevs": 2,
00:29:30.573 "num_base_bdevs_discovered": 1,
00:29:30.573 "num_base_bdevs_operational": 1,
00:29:30.573 "base_bdevs_list": [
00:29:30.573 {
00:29:30.573 "name": null,
00:29:30.573 "uuid": "00000000-0000-0000-0000-000000000000",
00:29:30.573 "is_configured": false,
00:29:30.573 "data_offset": 256,
00:29:30.573 "data_size": 7936
00:29:30.573 },
00:29:30.573 {
00:29:30.573 "name": "BaseBdev2",
00:29:30.573 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b",
00:29:30.573 "is_configured": true,
00:29:30.573 "data_offset": 256,
00:29:30.573 "data_size": 7936
00:29:30.573 }
00:29:30.573 ]
00:29:30.573 }'
00:29:30.573 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:29:30.573 10:37:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:29:31.138 10:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:29:31.395 [2024-07-15 10:37:08.465161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:29:31.395 [2024-07-15 10:37:08.468776] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x116c250
00:29:31.395 [2024-07-15 10:37:08.470779] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:29:31.395 10:37:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1
00:29:32.326 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:29:32.326 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:29:32.326 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:29:32.326 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare
00:29:32.326 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:29:32.326 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:29:32.326 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:29:32.583 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:29:32.583 "name": "raid_bdev1",
00:29:32.583 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159",
00:29:32.583 "strip_size_kb": 0,
00:29:32.583 "state": "online",
00:29:32.583 "raid_level": "raid1",
00:29:32.583 "superblock": true,
00:29:32.583 "num_base_bdevs": 2,
00:29:32.583 "num_base_bdevs_discovered": 2,
00:29:32.583 "num_base_bdevs_operational": 2,
00:29:32.583 "process": {
00:29:32.583 "type": "rebuild",
00:29:32.583 "target": "spare",
00:29:32.583 "progress": {
00:29:32.583 "blocks": 3072,
00:29:32.583 "percent": 38
00:29:32.583 }
00:29:32.583 },
00:29:32.583 "base_bdevs_list": [
00:29:32.583 {
00:29:32.583 "name": "spare",
00:29:32.584 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d",
00:29:32.584 "is_configured": true,
00:29:32.584 "data_offset": 256,
00:29:32.584 "data_size": 7936
00:29:32.584 },
00:29:32.584 {
00:29:32.584 "name": "BaseBdev2",
00:29:32.584 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b",
00:29:32.584 "is_configured": true,
00:29:32.584 "data_offset": 256,
00:29:32.584 "data_size": 7936
00:29:32.584 }
00:29:32.584 ]
00:29:32.584 }'
00:29:32.584 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:29:32.841 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:29:32.841 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:29:32.841 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:29:32.841 10:37:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:29:33.099 [2024-07-15 10:37:10.056143] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:29:33.099 [2024-07-15 10:37:10.083341] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:29:33.099 [2024-07-15 10:37:10.083387] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:29:33.099 [2024-07-15 10:37:10.083403] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:29:33.099 [2024-07-15 10:37:10.083412] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:29:33.099 10:37:10
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.099 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:33.357 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:33.357 "name": "raid_bdev1", 00:29:33.357 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:33.357 "strip_size_kb": 0, 00:29:33.357 "state": "online", 00:29:33.357 "raid_level": "raid1", 00:29:33.357 "superblock": true, 00:29:33.357 "num_base_bdevs": 2, 00:29:33.357 "num_base_bdevs_discovered": 1, 00:29:33.357 "num_base_bdevs_operational": 1, 00:29:33.357 "base_bdevs_list": [ 00:29:33.357 { 00:29:33.357 "name": null, 00:29:33.357 
"uuid": "00000000-0000-0000-0000-000000000000", 00:29:33.357 "is_configured": false, 00:29:33.357 "data_offset": 256, 00:29:33.357 "data_size": 7936 00:29:33.357 }, 00:29:33.357 { 00:29:33.357 "name": "BaseBdev2", 00:29:33.357 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:33.357 "is_configured": true, 00:29:33.357 "data_offset": 256, 00:29:33.357 "data_size": 7936 00:29:33.357 } 00:29:33.357 ] 00:29:33.357 }' 00:29:33.357 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:33.357 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:33.920 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:33.920 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:33.920 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:33.920 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:33.920 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:33.920 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.920 10:37:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:34.177 10:37:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:34.177 "name": "raid_bdev1", 00:29:34.177 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:34.177 "strip_size_kb": 0, 00:29:34.177 "state": "online", 00:29:34.177 "raid_level": "raid1", 00:29:34.177 "superblock": true, 00:29:34.177 
"num_base_bdevs": 2, 00:29:34.177 "num_base_bdevs_discovered": 1, 00:29:34.177 "num_base_bdevs_operational": 1, 00:29:34.177 "base_bdevs_list": [ 00:29:34.177 { 00:29:34.177 "name": null, 00:29:34.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:34.177 "is_configured": false, 00:29:34.177 "data_offset": 256, 00:29:34.177 "data_size": 7936 00:29:34.177 }, 00:29:34.177 { 00:29:34.177 "name": "BaseBdev2", 00:29:34.177 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:34.177 "is_configured": true, 00:29:34.177 "data_offset": 256, 00:29:34.177 "data_size": 7936 00:29:34.177 } 00:29:34.177 ] 00:29:34.177 }' 00:29:34.177 10:37:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:34.177 10:37:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:34.177 10:37:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:34.177 10:37:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:34.177 10:37:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:34.500 [2024-07-15 10:37:11.531174] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:34.500 [2024-07-15 10:37:11.534807] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1168270 00:29:34.500 [2024-07-15 10:37:11.536250] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:34.500 10:37:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:29:35.434 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:29:35.434 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:35.434 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:35.434 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:35.434 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:35.434 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.434 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:35.692 "name": "raid_bdev1", 00:29:35.692 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:35.692 "strip_size_kb": 0, 00:29:35.692 "state": "online", 00:29:35.692 "raid_level": "raid1", 00:29:35.692 "superblock": true, 00:29:35.692 "num_base_bdevs": 2, 00:29:35.692 "num_base_bdevs_discovered": 2, 00:29:35.692 "num_base_bdevs_operational": 2, 00:29:35.692 "process": { 00:29:35.692 "type": "rebuild", 00:29:35.692 "target": "spare", 00:29:35.692 "progress": { 00:29:35.692 "blocks": 3072, 00:29:35.692 "percent": 38 00:29:35.692 } 00:29:35.692 }, 00:29:35.692 "base_bdevs_list": [ 00:29:35.692 { 00:29:35.692 "name": "spare", 00:29:35.692 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:35.692 "is_configured": true, 00:29:35.692 "data_offset": 256, 00:29:35.692 "data_size": 7936 00:29:35.692 }, 00:29:35.692 { 00:29:35.692 "name": "BaseBdev2", 00:29:35.692 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:35.692 "is_configured": true, 00:29:35.692 "data_offset": 256, 00:29:35.692 "data_size": 7936 00:29:35.692 
} 00:29:35.692 ] 00:29:35.692 }' 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:35.692 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1117 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:35.692 10:37:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:36.254 10:37:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:36.254 "name": "raid_bdev1", 00:29:36.254 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:36.254 "strip_size_kb": 0, 00:29:36.254 "state": "online", 00:29:36.254 "raid_level": "raid1", 00:29:36.254 "superblock": true, 00:29:36.254 "num_base_bdevs": 2, 00:29:36.254 "num_base_bdevs_discovered": 2, 00:29:36.254 "num_base_bdevs_operational": 2, 00:29:36.254 "process": { 00:29:36.254 "type": "rebuild", 00:29:36.254 "target": "spare", 00:29:36.254 "progress": { 00:29:36.254 "blocks": 4608, 00:29:36.254 "percent": 58 00:29:36.254 } 00:29:36.254 }, 00:29:36.254 "base_bdevs_list": [ 00:29:36.254 { 00:29:36.254 "name": "spare", 00:29:36.254 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:36.254 "is_configured": true, 00:29:36.254 "data_offset": 256, 00:29:36.254 "data_size": 7936 00:29:36.254 }, 00:29:36.254 { 00:29:36.254 "name": "BaseBdev2", 00:29:36.254 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:36.254 "is_configured": true, 00:29:36.254 "data_offset": 256, 00:29:36.254 "data_size": 7936 00:29:36.254 } 00:29:36.254 ] 00:29:36.254 }' 00:29:36.254 10:37:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:36.254 10:37:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:36.254 10:37:13 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:36.510 10:37:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:36.510 10:37:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.443 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:37.701 [2024-07-15 10:37:14.660409] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:37.701 [2024-07-15 10:37:14.660474] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:37.701 [2024-07-15 10:37:14.660563] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:37.701 "name": "raid_bdev1", 
00:29:37.701 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:37.701 "strip_size_kb": 0, 00:29:37.701 "state": "online", 00:29:37.701 "raid_level": "raid1", 00:29:37.701 "superblock": true, 00:29:37.701 "num_base_bdevs": 2, 00:29:37.701 "num_base_bdevs_discovered": 2, 00:29:37.701 "num_base_bdevs_operational": 2, 00:29:37.701 "base_bdevs_list": [ 00:29:37.701 { 00:29:37.701 "name": "spare", 00:29:37.701 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:37.701 "is_configured": true, 00:29:37.701 "data_offset": 256, 00:29:37.701 "data_size": 7936 00:29:37.701 }, 00:29:37.701 { 00:29:37.701 "name": "BaseBdev2", 00:29:37.701 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:37.701 "is_configured": true, 00:29:37.701 "data_offset": 256, 00:29:37.701 "data_size": 7936 00:29:37.701 } 00:29:37.701 ] 00:29:37.701 }' 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:37.701 10:37:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.701 10:37:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:37.959 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:37.959 "name": "raid_bdev1", 00:29:37.959 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:37.959 "strip_size_kb": 0, 00:29:37.959 "state": "online", 00:29:37.959 "raid_level": "raid1", 00:29:37.959 "superblock": true, 00:29:37.959 "num_base_bdevs": 2, 00:29:37.959 "num_base_bdevs_discovered": 2, 00:29:37.959 "num_base_bdevs_operational": 2, 00:29:37.959 "base_bdevs_list": [ 00:29:37.959 { 00:29:37.959 "name": "spare", 00:29:37.959 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:37.959 "is_configured": true, 00:29:37.959 "data_offset": 256, 00:29:37.959 "data_size": 7936 00:29:37.959 }, 00:29:37.959 { 00:29:37.959 "name": "BaseBdev2", 00:29:37.959 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:37.959 "is_configured": true, 00:29:37.959 "data_offset": 256, 00:29:37.959 "data_size": 7936 00:29:37.959 } 00:29:37.959 ] 00:29:37.959 }' 00:29:37.959 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:37.959 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:37.959 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:38.216 10:37:15 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:38.216 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:38.216 "name": "raid_bdev1", 00:29:38.216 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:38.216 "strip_size_kb": 0, 00:29:38.216 "state": "online", 00:29:38.216 "raid_level": "raid1", 00:29:38.216 "superblock": true, 00:29:38.216 "num_base_bdevs": 2, 00:29:38.216 
"num_base_bdevs_discovered": 2, 00:29:38.216 "num_base_bdevs_operational": 2, 00:29:38.216 "base_bdevs_list": [ 00:29:38.216 { 00:29:38.216 "name": "spare", 00:29:38.216 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:38.216 "is_configured": true, 00:29:38.216 "data_offset": 256, 00:29:38.216 "data_size": 7936 00:29:38.216 }, 00:29:38.216 { 00:29:38.216 "name": "BaseBdev2", 00:29:38.217 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:38.217 "is_configured": true, 00:29:38.217 "data_offset": 256, 00:29:38.217 "data_size": 7936 00:29:38.217 } 00:29:38.217 ] 00:29:38.217 }' 00:29:38.217 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:38.217 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:39.147 10:37:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:39.147 [2024-07-15 10:37:16.209142] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:39.147 [2024-07-15 10:37:16.209169] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:39.147 [2024-07-15 10:37:16.209226] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:39.147 [2024-07-15 10:37:16.209281] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:39.147 [2024-07-15 10:37:16.209294] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116c370 name raid_bdev1, state offline 00:29:39.147 10:37:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.147 10:37:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@719 -- # jq length 00:29:39.404 10:37:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:39.404 10:37:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:29:39.404 10:37:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:39.404 10:37:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:39.661 10:37:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:40.225 [2024-07-15 10:37:17.199702] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:40.225 [2024-07-15 10:37:17.199745] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:40.225 [2024-07-15 10:37:17.199765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116c040 00:29:40.225 [2024-07-15 10:37:17.199784] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:40.225 [2024-07-15 10:37:17.201285] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:40.225 [2024-07-15 10:37:17.201312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:40.225 [2024-07-15 10:37:17.201367] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:40.225 [2024-07-15 10:37:17.201393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:40.225 [2024-07-15 10:37:17.201482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:40.225 spare 00:29:40.225 10:37:17 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:40.225 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:40.225 [2024-07-15 10:37:17.301789] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x116cf60 00:29:40.225 [2024-07-15 10:37:17.301803] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:29:40.225 [2024-07-15 10:37:17.301878] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x116cde0 00:29:40.225 [2024-07-15 10:37:17.301979] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116cf60 00:29:40.225 [2024-07-15 10:37:17.301990] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x116cf60 00:29:40.225 [2024-07-15 10:37:17.302057] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:40.482 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:40.482 "name": "raid_bdev1", 00:29:40.482 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:40.482 "strip_size_kb": 0, 00:29:40.482 "state": "online", 00:29:40.482 "raid_level": "raid1", 00:29:40.482 "superblock": true, 00:29:40.482 "num_base_bdevs": 2, 00:29:40.482 "num_base_bdevs_discovered": 2, 00:29:40.482 "num_base_bdevs_operational": 2, 00:29:40.482 "base_bdevs_list": [ 00:29:40.482 { 00:29:40.482 "name": "spare", 00:29:40.482 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:40.482 "is_configured": true, 00:29:40.482 "data_offset": 256, 00:29:40.482 "data_size": 7936 00:29:40.482 }, 00:29:40.482 { 00:29:40.482 "name": "BaseBdev2", 00:29:40.482 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:40.482 "is_configured": true, 00:29:40.482 "data_offset": 256, 00:29:40.482 "data_size": 7936 00:29:40.482 } 00:29:40.482 ] 00:29:40.482 }' 00:29:40.482 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:40.482 10:37:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:41.047 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:41.047 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:41.047 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:41.047 10:37:18 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:41.047 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:41.047 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:41.047 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:41.305 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:41.305 "name": "raid_bdev1", 00:29:41.305 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:41.305 "strip_size_kb": 0, 00:29:41.305 "state": "online", 00:29:41.305 "raid_level": "raid1", 00:29:41.305 "superblock": true, 00:29:41.305 "num_base_bdevs": 2, 00:29:41.305 "num_base_bdevs_discovered": 2, 00:29:41.305 "num_base_bdevs_operational": 2, 00:29:41.305 "base_bdevs_list": [ 00:29:41.305 { 00:29:41.305 "name": "spare", 00:29:41.305 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:41.305 "is_configured": true, 00:29:41.305 "data_offset": 256, 00:29:41.305 "data_size": 7936 00:29:41.305 }, 00:29:41.305 { 00:29:41.305 "name": "BaseBdev2", 00:29:41.305 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:41.305 "is_configured": true, 00:29:41.305 "data_offset": 256, 00:29:41.305 "data_size": 7936 00:29:41.305 } 00:29:41.305 ] 00:29:41.305 }' 00:29:41.305 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:41.305 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:41.305 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:41.305 10:37:18 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:41.305 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:41.305 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:41.562 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:41.562 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:41.820 [2024-07-15 10:37:18.860213] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:41.820 10:37:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.077 10:37:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.077 "name": "raid_bdev1", 00:29:42.077 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:42.077 "strip_size_kb": 0, 00:29:42.077 "state": "online", 00:29:42.077 "raid_level": "raid1", 00:29:42.077 "superblock": true, 00:29:42.077 "num_base_bdevs": 2, 00:29:42.077 "num_base_bdevs_discovered": 1, 00:29:42.077 "num_base_bdevs_operational": 1, 00:29:42.077 "base_bdevs_list": [ 00:29:42.077 { 00:29:42.077 "name": null, 00:29:42.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.077 "is_configured": false, 00:29:42.077 "data_offset": 256, 00:29:42.077 "data_size": 7936 00:29:42.077 }, 00:29:42.077 { 00:29:42.078 "name": "BaseBdev2", 00:29:42.078 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:42.078 "is_configured": true, 00:29:42.078 "data_offset": 256, 00:29:42.078 "data_size": 7936 00:29:42.078 } 00:29:42.078 ] 00:29:42.078 }' 00:29:42.078 10:37:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.078 10:37:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:42.642 10:37:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:42.900 [2024-07-15 10:37:19.939099] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:42.900 [2024-07-15 10:37:19.939240] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:42.900 [2024-07-15 10:37:19.939257] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:42.900 [2024-07-15 10:37:19.939283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:42.900 [2024-07-15 10:37:19.942761] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x116e3a0 00:29:42.900 [2024-07-15 10:37:19.944172] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:42.900 10:37:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:43.831 10:37:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:43.831 10:37:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:43.831 10:37:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:43.831 10:37:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:43.831 10:37:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:43.831 10:37:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.831 10:37:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:44.089 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:29:44.089 "name": "raid_bdev1", 00:29:44.089 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:44.089 "strip_size_kb": 0, 00:29:44.089 "state": "online", 00:29:44.089 "raid_level": "raid1", 00:29:44.089 "superblock": true, 00:29:44.089 "num_base_bdevs": 2, 00:29:44.089 "num_base_bdevs_discovered": 2, 00:29:44.089 "num_base_bdevs_operational": 2, 00:29:44.089 "process": { 00:29:44.089 "type": "rebuild", 00:29:44.089 "target": "spare", 00:29:44.089 "progress": { 00:29:44.089 "blocks": 3072, 00:29:44.089 "percent": 38 00:29:44.089 } 00:29:44.089 }, 00:29:44.089 "base_bdevs_list": [ 00:29:44.089 { 00:29:44.089 "name": "spare", 00:29:44.089 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:44.089 "is_configured": true, 00:29:44.089 "data_offset": 256, 00:29:44.089 "data_size": 7936 00:29:44.089 }, 00:29:44.089 { 00:29:44.089 "name": "BaseBdev2", 00:29:44.089 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:44.089 "is_configured": true, 00:29:44.089 "data_offset": 256, 00:29:44.089 "data_size": 7936 00:29:44.089 } 00:29:44.089 ] 00:29:44.089 }' 00:29:44.089 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:44.089 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:44.089 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:44.346 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:44.346 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:44.603 [2024-07-15 10:37:21.790063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:44.861 [2024-07-15 10:37:21.859099] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:44.861 [2024-07-15 10:37:21.859140] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:44.861 [2024-07-15 10:37:21.859155] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:44.861 [2024-07-15 10:37:21.859164] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.861 10:37:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:29:45.119 10:37:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:45.119 "name": "raid_bdev1", 00:29:45.119 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:45.119 "strip_size_kb": 0, 00:29:45.119 "state": "online", 00:29:45.119 "raid_level": "raid1", 00:29:45.119 "superblock": true, 00:29:45.119 "num_base_bdevs": 2, 00:29:45.119 "num_base_bdevs_discovered": 1, 00:29:45.119 "num_base_bdevs_operational": 1, 00:29:45.119 "base_bdevs_list": [ 00:29:45.119 { 00:29:45.119 "name": null, 00:29:45.119 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:45.119 "is_configured": false, 00:29:45.119 "data_offset": 256, 00:29:45.119 "data_size": 7936 00:29:45.119 }, 00:29:45.119 { 00:29:45.119 "name": "BaseBdev2", 00:29:45.119 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:45.119 "is_configured": true, 00:29:45.119 "data_offset": 256, 00:29:45.119 "data_size": 7936 00:29:45.119 } 00:29:45.119 ] 00:29:45.119 }' 00:29:45.119 10:37:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:45.119 10:37:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:45.682 10:37:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:45.939 [2024-07-15 10:37:22.969992] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:45.939 [2024-07-15 10:37:22.970041] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:45.939 [2024-07-15 10:37:22.970063] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x116bc80 00:29:45.939 [2024-07-15 10:37:22.970077] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:45.939 [2024-07-15 
10:37:22.970271] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:45.939 [2024-07-15 10:37:22.970287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:45.939 [2024-07-15 10:37:22.970341] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:45.939 [2024-07-15 10:37:22.970357] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:45.939 [2024-07-15 10:37:22.970368] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:45.939 [2024-07-15 10:37:22.970386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:45.939 [2024-07-15 10:37:22.973883] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x116c2d0 00:29:45.939 [2024-07-15 10:37:22.975218] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:45.939 spare 00:29:45.939 10:37:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:46.880 10:37:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:46.880 10:37:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:46.880 10:37:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:46.880 10:37:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:46.880 10:37:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:46.880 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:29:46.881 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:47.160 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:47.160 "name": "raid_bdev1", 00:29:47.160 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:47.160 "strip_size_kb": 0, 00:29:47.160 "state": "online", 00:29:47.160 "raid_level": "raid1", 00:29:47.160 "superblock": true, 00:29:47.160 "num_base_bdevs": 2, 00:29:47.160 "num_base_bdevs_discovered": 2, 00:29:47.160 "num_base_bdevs_operational": 2, 00:29:47.160 "process": { 00:29:47.160 "type": "rebuild", 00:29:47.160 "target": "spare", 00:29:47.160 "progress": { 00:29:47.160 "blocks": 3072, 00:29:47.160 "percent": 38 00:29:47.160 } 00:29:47.160 }, 00:29:47.160 "base_bdevs_list": [ 00:29:47.160 { 00:29:47.160 "name": "spare", 00:29:47.160 "uuid": "a0f0b205-967d-5b93-97e7-808a9dab2f6d", 00:29:47.160 "is_configured": true, 00:29:47.160 "data_offset": 256, 00:29:47.160 "data_size": 7936 00:29:47.160 }, 00:29:47.160 { 00:29:47.160 "name": "BaseBdev2", 00:29:47.160 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:47.160 "is_configured": true, 00:29:47.160 "data_offset": 256, 00:29:47.160 "data_size": 7936 00:29:47.160 } 00:29:47.160 ] 00:29:47.160 }' 00:29:47.160 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:47.160 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:47.160 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:47.160 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:47.160 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:47.416 [2024-07-15 10:37:24.565040] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:47.416 [2024-07-15 10:37:24.587905] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:47.416 [2024-07-15 10:37:24.587951] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:47.416 [2024-07-15 10:37:24.587966] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:47.416 [2024-07-15 10:37:24.587975] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:47.416 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:47.416 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:47.416 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:47.416 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:47.416 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:47.416 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 
00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:47.673 "name": "raid_bdev1", 00:29:47.673 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:47.673 "strip_size_kb": 0, 00:29:47.673 "state": "online", 00:29:47.673 "raid_level": "raid1", 00:29:47.673 "superblock": true, 00:29:47.673 "num_base_bdevs": 2, 00:29:47.673 "num_base_bdevs_discovered": 1, 00:29:47.673 "num_base_bdevs_operational": 1, 00:29:47.673 "base_bdevs_list": [ 00:29:47.673 { 00:29:47.673 "name": null, 00:29:47.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:47.673 "is_configured": false, 00:29:47.673 "data_offset": 256, 00:29:47.673 "data_size": 7936 00:29:47.673 }, 00:29:47.673 { 00:29:47.673 "name": "BaseBdev2", 00:29:47.673 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:47.673 "is_configured": true, 00:29:47.673 "data_offset": 256, 00:29:47.673 "data_size": 7936 00:29:47.673 } 00:29:47.673 ] 00:29:47.673 }' 00:29:47.673 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:47.674 10:37:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:48.603 10:37:25 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:48.603 "name": "raid_bdev1", 00:29:48.603 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:48.603 "strip_size_kb": 0, 00:29:48.603 "state": "online", 00:29:48.603 "raid_level": "raid1", 00:29:48.603 "superblock": true, 00:29:48.603 "num_base_bdevs": 2, 00:29:48.603 "num_base_bdevs_discovered": 1, 00:29:48.603 "num_base_bdevs_operational": 1, 00:29:48.603 "base_bdevs_list": [ 00:29:48.603 { 00:29:48.603 "name": null, 00:29:48.603 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:48.603 "is_configured": false, 00:29:48.603 "data_offset": 256, 00:29:48.603 "data_size": 7936 00:29:48.603 }, 00:29:48.603 { 00:29:48.603 "name": "BaseBdev2", 00:29:48.603 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:48.603 "is_configured": true, 00:29:48.603 "data_offset": 256, 00:29:48.603 "data_size": 7936 00:29:48.603 } 00:29:48.603 ] 00:29:48.603 }' 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:48.603 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:48.860 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:48.860 10:37:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:48.860 10:37:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:49.118 [2024-07-15 10:37:26.280757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:49.118 [2024-07-15 10:37:26.280807] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:49.118 [2024-07-15 10:37:26.280830] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfd27e0 00:29:49.118 [2024-07-15 10:37:26.280843] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:49.118 [2024-07-15 10:37:26.281018] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:49.118 [2024-07-15 10:37:26.281034] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:49.118 [2024-07-15 10:37:26.281080] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:49.118 [2024-07-15 10:37:26.281091] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:49.118 [2024-07-15 10:37:26.281102] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:49.118 BaseBdev1 00:29:49.118 10:37:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:50.488 
10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:50.488 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:50.488 "name": "raid_bdev1", 00:29:50.489 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:50.489 "strip_size_kb": 0, 00:29:50.489 "state": "online", 00:29:50.489 "raid_level": "raid1", 00:29:50.489 "superblock": true, 00:29:50.489 "num_base_bdevs": 2, 00:29:50.489 "num_base_bdevs_discovered": 1, 00:29:50.489 "num_base_bdevs_operational": 1, 00:29:50.489 "base_bdevs_list": [ 00:29:50.489 { 00:29:50.489 "name": null, 
00:29:50.489 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:50.489 "is_configured": false, 00:29:50.489 "data_offset": 256, 00:29:50.489 "data_size": 7936 00:29:50.489 }, 00:29:50.489 { 00:29:50.489 "name": "BaseBdev2", 00:29:50.489 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:50.489 "is_configured": true, 00:29:50.489 "data_offset": 256, 00:29:50.489 "data_size": 7936 00:29:50.489 } 00:29:50.489 ] 00:29:50.489 }' 00:29:50.489 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:50.489 10:37:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:51.053 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:51.053 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:51.053 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:51.053 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:51.053 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:51.053 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:51.053 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:51.311 "name": "raid_bdev1", 00:29:51.311 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:51.311 "strip_size_kb": 0, 00:29:51.311 "state": "online", 00:29:51.311 "raid_level": "raid1", 00:29:51.311 "superblock": true, 
00:29:51.311 "num_base_bdevs": 2, 00:29:51.311 "num_base_bdevs_discovered": 1, 00:29:51.311 "num_base_bdevs_operational": 1, 00:29:51.311 "base_bdevs_list": [ 00:29:51.311 { 00:29:51.311 "name": null, 00:29:51.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:51.311 "is_configured": false, 00:29:51.311 "data_offset": 256, 00:29:51.311 "data_size": 7936 00:29:51.311 }, 00:29:51.311 { 00:29:51.311 "name": "BaseBdev2", 00:29:51.311 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:51.311 "is_configured": true, 00:29:51.311 "data_offset": 256, 00:29:51.311 "data_size": 7936 00:29:51.311 } 00:29:51.311 ] 00:29:51.311 }' 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:51.311 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:51.568 [2024-07-15 10:37:28.707218] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:51.568 [2024-07-15 10:37:28.707346] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:51.568 [2024-07-15 10:37:28.707362] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:51.568 request: 00:29:51.568 { 00:29:51.568 "base_bdev": "BaseBdev1", 00:29:51.568 "raid_bdev": "raid_bdev1", 00:29:51.568 "method": "bdev_raid_add_base_bdev", 00:29:51.568 "req_id": 1 00:29:51.568 } 00:29:51.568 Got JSON-RPC error response 00:29:51.568 response: 
00:29:51.568 { 00:29:51.568 "code": -22, 00:29:51.568 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:51.568 } 00:29:51.568 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:29:51.568 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:51.568 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:51.568 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:51.568 10:37:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:52.592 10:37:29 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.592 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:52.849 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:52.849 "name": "raid_bdev1", 00:29:52.849 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:52.849 "strip_size_kb": 0, 00:29:52.849 "state": "online", 00:29:52.849 "raid_level": "raid1", 00:29:52.849 "superblock": true, 00:29:52.849 "num_base_bdevs": 2, 00:29:52.849 "num_base_bdevs_discovered": 1, 00:29:52.849 "num_base_bdevs_operational": 1, 00:29:52.849 "base_bdevs_list": [ 00:29:52.849 { 00:29:52.849 "name": null, 00:29:52.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:52.849 "is_configured": false, 00:29:52.849 "data_offset": 256, 00:29:52.849 "data_size": 7936 00:29:52.849 }, 00:29:52.849 { 00:29:52.849 "name": "BaseBdev2", 00:29:52.849 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:52.849 "is_configured": true, 00:29:52.849 "data_offset": 256, 00:29:52.849 "data_size": 7936 00:29:52.849 } 00:29:52.849 ] 00:29:52.849 }' 00:29:52.849 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:52.849 10:37:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:53.414 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:53.414 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:53.414 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:53.414 10:37:30 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:53.414 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:53.414 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.414 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:53.671 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:53.671 "name": "raid_bdev1", 00:29:53.671 "uuid": "699b37ff-6fb5-412a-96ca-b0c09e506159", 00:29:53.671 "strip_size_kb": 0, 00:29:53.671 "state": "online", 00:29:53.671 "raid_level": "raid1", 00:29:53.671 "superblock": true, 00:29:53.671 "num_base_bdevs": 2, 00:29:53.671 "num_base_bdevs_discovered": 1, 00:29:53.671 "num_base_bdevs_operational": 1, 00:29:53.671 "base_bdevs_list": [ 00:29:53.671 { 00:29:53.671 "name": null, 00:29:53.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:53.671 "is_configured": false, 00:29:53.671 "data_offset": 256, 00:29:53.671 "data_size": 7936 00:29:53.671 }, 00:29:53.671 { 00:29:53.671 "name": "BaseBdev2", 00:29:53.671 "uuid": "642dcbea-5259-5657-ad8b-aed1efd7626b", 00:29:53.671 "is_configured": true, 00:29:53.671 "data_offset": 256, 00:29:53.671 "data_size": 7936 00:29:53.671 } 00:29:53.671 ] 00:29:53.671 }' 00:29:53.671 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:53.671 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:53.671 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 633402 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 633402 ']' 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 633402 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 633402 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 633402' 00:29:53.951 killing process with pid 633402 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 633402 00:29:53.951 Received shutdown signal, test time was about 60.000000 seconds 00:29:53.951 00:29:53.951 Latency(us) 00:29:53.951 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:53.951 =================================================================================================================== 00:29:53.951 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:53.951 [2024-07-15 10:37:30.960560] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:53.951 [2024-07-15 10:37:30.960649] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:53.951 
[2024-07-15 10:37:30.960694] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:53.951 [2024-07-15 10:37:30.960707] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116cf60 name raid_bdev1, state offline 00:29:53.951 10:37:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 633402 00:29:53.951 [2024-07-15 10:37:30.991201] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:54.209 10:37:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:29:54.209 00:29:54.209 real 0m28.266s 00:29:54.209 user 0m45.691s 00:29:54.209 sys 0m3.890s 00:29:54.209 10:37:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:54.209 10:37:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:29:54.209 ************************************ 00:29:54.209 END TEST raid_rebuild_test_sb_md_interleaved 00:29:54.209 ************************************ 00:29:54.209 10:37:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:54.209 10:37:31 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:29:54.209 10:37:31 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:29:54.209 10:37:31 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 633402 ']' 00:29:54.209 10:37:31 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 633402 00:29:54.209 10:37:31 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:29:54.209 00:29:54.209 real 18m25.830s 00:29:54.209 user 31m16.822s 00:29:54.209 sys 3m20.377s 00:29:54.209 10:37:31 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:54.209 10:37:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:54.209 ************************************ 00:29:54.209 END TEST bdev_raid 00:29:54.209 ************************************ 00:29:54.209 10:37:31 -- 
common/autotest_common.sh@1142 -- # return 0 00:29:54.209 10:37:31 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:54.209 10:37:31 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:29:54.209 10:37:31 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:54.209 10:37:31 -- common/autotest_common.sh@10 -- # set +x 00:29:54.209 ************************************ 00:29:54.209 START TEST bdevperf_config 00:29:54.209 ************************************ 00:29:54.209 10:37:31 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:29:54.466 * Looking for test storage... 00:29:54.466 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:29:54.466 10:37:31 
bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:54.466 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:29:54.466 10:37:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:54.467 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:54.467 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 
00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:54.467 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:29:54.467 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:29:54.467 10:37:31 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:57.742 10:37:34 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 10:37:31.566586] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:29:57.742 [2024-07-15 10:37:31.566652] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637544 ] 00:29:57.742 Using job config with 4 jobs 00:29:57.742 [2024-07-15 10:37:31.710442] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.742 [2024-07-15 10:37:31.822795] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.742 cpumask for '\''job0'\'' is too big 00:29:57.742 cpumask for '\''job1'\'' is too big 00:29:57.742 cpumask for '\''job2'\'' is too big 00:29:57.742 cpumask for '\''job3'\'' is too big 00:29:57.742 Running I/O for 2 seconds... 00:29:57.742 00:29:57.742 Latency(us) 00:29:57.742 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24098.34 23.53 0.00 0.00 10614.74 1852.10 16298.52 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24076.35 23.51 0.00 0.00 10601.27 1837.86 14417.92 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24054.45 23.49 0.00 0.00 10586.85 1837.86 12594.31 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24032.62 23.47 0.00 0.00 10572.32 1837.86 10884.67 00:29:57.742 =================================================================================================================== 00:29:57.742 Total : 96261.76 94.01 0.00 0.00 10593.79 1837.86 16298.52' 00:29:57.742 10:37:34 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 10:37:31.566586] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:29:57.742 [2024-07-15 10:37:31.566652] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637544 ] 00:29:57.742 Using job config with 4 jobs 00:29:57.742 [2024-07-15 10:37:31.710442] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.742 [2024-07-15 10:37:31.822795] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.742 cpumask for '\''job0'\'' is too big 00:29:57.742 cpumask for '\''job1'\'' is too big 00:29:57.742 cpumask for '\''job2'\'' is too big 00:29:57.742 cpumask for '\''job3'\'' is too big 00:29:57.742 Running I/O for 2 seconds... 00:29:57.742 00:29:57.742 Latency(us) 00:29:57.742 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24098.34 23.53 0.00 0.00 10614.74 1852.10 16298.52 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24076.35 23.51 0.00 0.00 10601.27 1837.86 14417.92 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24054.45 23.49 0.00 0.00 10586.85 1837.86 12594.31 00:29:57.742 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.742 Malloc0 : 2.02 24032.62 23.47 0.00 0.00 10572.32 1837.86 10884.67 00:29:57.743 =================================================================================================================== 00:29:57.743 Total : 96261.76 94.01 0.00 0.00 10593.79 1837.86 16298.52' 00:29:57.743 10:37:34 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 10:37:31.566586] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:29:57.743 [2024-07-15 10:37:31.566652] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637544 ] 00:29:57.743 Using job config with 4 jobs 00:29:57.743 [2024-07-15 10:37:31.710442] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.743 [2024-07-15 10:37:31.822795] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.743 cpumask for '\''job0'\'' is too big 00:29:57.743 cpumask for '\''job1'\'' is too big 00:29:57.743 cpumask for '\''job2'\'' is too big 00:29:57.743 cpumask for '\''job3'\'' is too big 00:29:57.743 Running I/O for 2 seconds... 00:29:57.743 00:29:57.743 Latency(us) 00:29:57.743 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:57.743 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.743 Malloc0 : 2.02 24098.34 23.53 0.00 0.00 10614.74 1852.10 16298.52 00:29:57.743 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.743 Malloc0 : 2.02 24076.35 23.51 0.00 0.00 10601.27 1837.86 14417.92 00:29:57.743 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.743 Malloc0 : 2.02 24054.45 23.49 0.00 0.00 10586.85 1837.86 12594.31 00:29:57.743 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:29:57.743 Malloc0 : 2.02 24032.62 23.47 0.00 0.00 10572.32 1837.86 10884.67 00:29:57.743 =================================================================================================================== 00:29:57.743 Total : 96261.76 94.01 0.00 0.00 10593.79 1837.86 16298.52' 00:29:57.743 10:37:34 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:29:57.743 10:37:34 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:29:57.743 10:37:34 
bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:29:57.743 10:37:34 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:29:57.743 [2024-07-15 10:37:34.329028] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:29:57.743 [2024-07-15 10:37:34.329094] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid637894 ] 00:29:57.743 [2024-07-15 10:37:34.469002] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.743 [2024-07-15 10:37:34.585312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:57.743 cpumask for 'job0' is too big 00:29:57.743 cpumask for 'job1' is too big 00:29:57.743 cpumask for 'job2' is too big 00:29:57.743 cpumask for 'job3' is too big 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:30:00.269 Running I/O for 2 seconds... 
00:30:00.269 00:30:00.269 Latency(us) 00:30:00.269 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:00.269 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:00.269 Malloc0 : 2.02 24102.61 23.54 0.00 0.00 10613.13 1866.35 16298.52 00:30:00.269 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:00.269 Malloc0 : 2.02 24080.54 23.52 0.00 0.00 10598.38 1837.86 14417.92 00:30:00.269 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:00.269 Malloc0 : 2.02 24058.64 23.49 0.00 0.00 10584.54 1837.86 12594.31 00:30:00.269 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:30:00.269 Malloc0 : 2.02 24036.86 23.47 0.00 0.00 10570.20 1837.86 10884.67 00:30:00.269 =================================================================================================================== 00:30:00.269 Total : 96278.65 94.02 0.00 0.00 10591.56 1837.86 16298.52' 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:00.269 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:00.269 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:30:00.269 10:37:37 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:00.270 10:37:37 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:00.270 10:37:37 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:00.270 00:30:00.270 10:37:37 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:00.270 10:37:37 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 10:37:37.090566] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:30:02.797 [2024-07-15 10:37:37.090631] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638252 ] 00:30:02.797 Using job config with 3 jobs 00:30:02.797 [2024-07-15 10:37:37.230777] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.797 [2024-07-15 10:37:37.338357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.797 cpumask for '\''job0'\'' is too big 00:30:02.797 cpumask for '\''job1'\'' is too big 00:30:02.797 cpumask for '\''job2'\'' is too big 00:30:02.797 Running I/O for 2 seconds... 00:30:02.797 00:30:02.797 Latency(us) 00:30:02.797 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.01 32537.91 31.78 0.00 0.00 7858.35 1809.36 11568.53 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.02 32508.03 31.75 0.00 0.00 7847.63 1802.24 9744.92 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.02 32478.37 31.72 0.00 0.00 7837.41 1787.99 8149.26 00:30:02.797 =================================================================================================================== 00:30:02.797 Total : 97524.32 95.24 0.00 0.00 7847.80 1787.99 11568.53' 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 10:37:37.090566] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:30:02.797 [2024-07-15 10:37:37.090631] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638252 ] 00:30:02.797 Using job config with 3 jobs 00:30:02.797 [2024-07-15 10:37:37.230777] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.797 [2024-07-15 10:37:37.338357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.797 cpumask for '\''job0'\'' is too big 00:30:02.797 cpumask for '\''job1'\'' is too big 00:30:02.797 cpumask for '\''job2'\'' is too big 00:30:02.797 Running I/O for 2 seconds... 00:30:02.797 00:30:02.797 Latency(us) 00:30:02.797 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.01 32537.91 31.78 0.00 0.00 7858.35 1809.36 11568.53 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.02 32508.03 31.75 0.00 0.00 7847.63 1802.24 9744.92 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.02 32478.37 31.72 0.00 0.00 7837.41 1787.99 8149.26 00:30:02.797 =================================================================================================================== 00:30:02.797 Total : 97524.32 95.24 0.00 0.00 7847.80 1787.99 11568.53' 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 10:37:37.090566] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:30:02.797 [2024-07-15 10:37:37.090631] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638252 ] 00:30:02.797 Using job config with 3 jobs 00:30:02.797 [2024-07-15 10:37:37.230777] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.797 [2024-07-15 10:37:37.338357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.797 cpumask for '\''job0'\'' is too big 00:30:02.797 cpumask for '\''job1'\'' is too big 00:30:02.797 cpumask for '\''job2'\'' is too big 00:30:02.797 Running I/O for 2 seconds... 00:30:02.797 00:30:02.797 Latency(us) 00:30:02.797 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.01 32537.91 31.78 0.00 0.00 7858.35 1809.36 11568.53 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.02 32508.03 31.75 0.00 0.00 7847.63 1802.24 9744.92 00:30:02.797 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:30:02.797 Malloc0 : 2.02 32478.37 31.72 0.00 0.00 7837.41 1787.99 8149.26 00:30:02.797 =================================================================================================================== 00:30:02.797 Total : 97524.32 95.24 0.00 0.00 7847.80 1787.99 11568.53' 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:02.797 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:02.797 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:30:02.797 10:37:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:30:02.798 10:37:39 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:02.798 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:02.798 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:30:02.798 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:30:02.798 10:37:39 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:06.080 10:37:42 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 10:37:39.868239] 
Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:30:06.080 [2024-07-15 10:37:39.868306] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638613 ] 00:30:06.080 Using job config with 4 jobs 00:30:06.080 [2024-07-15 10:37:40.013931] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:06.080 [2024-07-15 10:37:40.135560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:06.080 cpumask for '\''job0'\'' is too big 00:30:06.080 cpumask for '\''job1'\'' is too big 00:30:06.080 cpumask for '\''job2'\'' is too big 00:30:06.080 cpumask for '\''job3'\'' is too big 00:30:06.080 Running I/O for 2 seconds... 00:30:06.080 00:30:06.080 Latency(us) 00:30:06.080 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:06.080 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.080 Malloc0 : 2.03 11997.76 11.72 0.00 0.00 21321.87 3846.68 33052.94 00:30:06.080 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.080 Malloc1 : 2.03 11986.53 11.71 0.00 0.00 21319.67 4644.51 33052.94 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.03 11975.65 11.69 0.00 0.00 21262.00 3818.18 29177.77 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 12011.52 11.73 0.00 0.00 21178.47 4616.01 29177.77 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.05 12000.71 11.72 0.00 0.00 21122.91 3789.69 25416.57 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 
Malloc1 : 2.05 11989.68 11.71 0.00 0.00 21121.68 4616.01 25416.57 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.05 11978.91 11.70 0.00 0.00 21067.42 3789.69 21655.37 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 11967.90 11.69 0.00 0.00 21068.87 4616.01 21655.37 00:30:06.081 =================================================================================================================== 00:30:06.081 Total : 95908.65 93.66 0.00 0.00 21182.40 3789.69 33052.94' 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 10:37:39.868239] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:30:06.081 [2024-07-15 10:37:39.868306] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638613 ] 00:30:06.081 Using job config with 4 jobs 00:30:06.081 [2024-07-15 10:37:40.013931] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:06.081 [2024-07-15 10:37:40.135560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:06.081 cpumask for '\''job0'\'' is too big 00:30:06.081 cpumask for '\''job1'\'' is too big 00:30:06.081 cpumask for '\''job2'\'' is too big 00:30:06.081 cpumask for '\''job3'\'' is too big 00:30:06.081 Running I/O for 2 seconds... 
00:30:06.081 00:30:06.081 Latency(us) 00:30:06.081 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.03 11997.76 11.72 0.00 0.00 21321.87 3846.68 33052.94 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.03 11986.53 11.71 0.00 0.00 21319.67 4644.51 33052.94 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.03 11975.65 11.69 0.00 0.00 21262.00 3818.18 29177.77 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 12011.52 11.73 0.00 0.00 21178.47 4616.01 29177.77 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.05 12000.71 11.72 0.00 0.00 21122.91 3789.69 25416.57 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 11989.68 11.71 0.00 0.00 21121.68 4616.01 25416.57 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.05 11978.91 11.70 0.00 0.00 21067.42 3789.69 21655.37 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 11967.90 11.69 0.00 0.00 21068.87 4616.01 21655.37 00:30:06.081 =================================================================================================================== 00:30:06.081 Total : 95908.65 93.66 0.00 0.00 21182.40 3789.69 33052.94' 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 10:37:39.868239] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:30:06.081 [2024-07-15 10:37:39.868306] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid638613 ] 00:30:06.081 Using job config with 4 jobs 00:30:06.081 [2024-07-15 10:37:40.013931] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:06.081 [2024-07-15 10:37:40.135560] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:06.081 cpumask for '\''job0'\'' is too big 00:30:06.081 cpumask for '\''job1'\'' is too big 00:30:06.081 cpumask for '\''job2'\'' is too big 00:30:06.081 cpumask for '\''job3'\'' is too big 00:30:06.081 Running I/O for 2 seconds... 00:30:06.081 00:30:06.081 Latency(us) 00:30:06.081 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.03 11997.76 11.72 0.00 0.00 21321.87 3846.68 33052.94 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.03 11986.53 11.71 0.00 0.00 21319.67 4644.51 33052.94 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.03 11975.65 11.69 0.00 0.00 21262.00 3818.18 29177.77 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 12011.52 11.73 0.00 0.00 21178.47 4616.01 29177.77 00:30:06.081 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.05 12000.71 11.72 0.00 0.00 21122.91 3789.69 25416.57 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 11989.68 11.71 0.00 0.00 21121.68 4616.01 25416.57 00:30:06.081 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc0 : 2.05 11978.91 11.70 0.00 0.00 21067.42 3789.69 21655.37 00:30:06.081 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:30:06.081 Malloc1 : 2.05 11967.90 11.69 0.00 0.00 21068.87 4616.01 21655.37 00:30:06.081 =================================================================================================================== 00:30:06.081 Total : 95908.65 93.66 0.00 0.00 21182.40 3789.69 33052.94' 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:30:06.081 10:37:42 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:30:06.081 00:30:06.081 real 0m11.235s 00:30:06.081 user 0m9.896s 00:30:06.081 sys 0m1.185s 00:30:06.081 10:37:42 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:06.081 10:37:42 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:30:06.081 ************************************ 00:30:06.081 END TEST bdevperf_config 00:30:06.081 ************************************ 00:30:06.081 10:37:42 -- common/autotest_common.sh@1142 -- # return 0 00:30:06.081 10:37:42 -- spdk/autotest.sh@192 -- # uname -s 00:30:06.081 10:37:42 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:30:06.081 10:37:42 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:06.081 10:37:42 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:30:06.081 10:37:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:06.081 10:37:42 -- common/autotest_common.sh@10 -- # set +x 00:30:06.081 ************************************ 00:30:06.081 START TEST reactor_set_interrupt 00:30:06.081 ************************************ 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:06.081 * Looking for test storage... 00:30:06.081 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.081 10:37:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:06.081 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:30:06.081 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.081 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.081 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:30:06.081 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:06.081 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:06.081 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:06.081 10:37:42 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:06.081 10:37:42 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:06.082 10:37:42 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:06.082 10:37:42 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:06.082 10:37:42 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:06.082 10:37:42 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:06.082 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:06.082 10:37:42 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:06.082 #define SPDK_CONFIG_H 00:30:06.082 #define SPDK_CONFIG_APPS 1 00:30:06.082 #define SPDK_CONFIG_ARCH native 00:30:06.082 #undef SPDK_CONFIG_ASAN 00:30:06.082 #undef SPDK_CONFIG_AVAHI 00:30:06.082 #undef SPDK_CONFIG_CET 00:30:06.082 #define SPDK_CONFIG_COVERAGE 1 00:30:06.082 #define SPDK_CONFIG_CROSS_PREFIX 
00:30:06.082 #define SPDK_CONFIG_CRYPTO 1 00:30:06.082 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:06.082 #undef SPDK_CONFIG_CUSTOMOCF 00:30:06.082 #undef SPDK_CONFIG_DAOS 00:30:06.082 #define SPDK_CONFIG_DAOS_DIR 00:30:06.082 #define SPDK_CONFIG_DEBUG 1 00:30:06.082 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:06.082 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:06.082 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:06.082 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:06.082 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:06.082 #undef SPDK_CONFIG_DPDK_UADK 00:30:06.082 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:06.082 #define SPDK_CONFIG_EXAMPLES 1 00:30:06.082 #undef SPDK_CONFIG_FC 00:30:06.082 #define SPDK_CONFIG_FC_PATH 00:30:06.082 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:06.082 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:06.082 #undef SPDK_CONFIG_FUSE 00:30:06.082 #undef SPDK_CONFIG_FUZZER 00:30:06.082 #define SPDK_CONFIG_FUZZER_LIB 00:30:06.082 #undef SPDK_CONFIG_GOLANG 00:30:06.082 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:06.082 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:06.082 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:06.082 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:06.082 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:06.082 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:06.082 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:06.082 #define SPDK_CONFIG_IDXD 1 00:30:06.082 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:06.082 #define SPDK_CONFIG_IPSEC_MB 1 00:30:06.082 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:06.082 #define SPDK_CONFIG_ISAL 1 00:30:06.082 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:06.082 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:06.082 #define SPDK_CONFIG_LIBDIR 00:30:06.082 #undef SPDK_CONFIG_LTO 00:30:06.082 #define SPDK_CONFIG_MAX_LCORES 128 00:30:06.082 #define SPDK_CONFIG_NVME_CUSE 1 00:30:06.082 #undef 
SPDK_CONFIG_OCF 00:30:06.082 #define SPDK_CONFIG_OCF_PATH 00:30:06.082 #define SPDK_CONFIG_OPENSSL_PATH 00:30:06.082 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:06.082 #define SPDK_CONFIG_PGO_DIR 00:30:06.082 #undef SPDK_CONFIG_PGO_USE 00:30:06.082 #define SPDK_CONFIG_PREFIX /usr/local 00:30:06.082 #undef SPDK_CONFIG_RAID5F 00:30:06.082 #undef SPDK_CONFIG_RBD 00:30:06.082 #define SPDK_CONFIG_RDMA 1 00:30:06.082 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:06.082 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:06.082 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:06.082 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:06.082 #define SPDK_CONFIG_SHARED 1 00:30:06.082 #undef SPDK_CONFIG_SMA 00:30:06.082 #define SPDK_CONFIG_TESTS 1 00:30:06.082 #undef SPDK_CONFIG_TSAN 00:30:06.082 #define SPDK_CONFIG_UBLK 1 00:30:06.082 #define SPDK_CONFIG_UBSAN 1 00:30:06.082 #undef SPDK_CONFIG_UNIT_TESTS 00:30:06.082 #undef SPDK_CONFIG_URING 00:30:06.082 #define SPDK_CONFIG_URING_PATH 00:30:06.082 #undef SPDK_CONFIG_URING_ZNS 00:30:06.083 #undef SPDK_CONFIG_USDT 00:30:06.083 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:06.083 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:06.083 #undef SPDK_CONFIG_VFIO_USER 00:30:06.083 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:06.083 #define SPDK_CONFIG_VHOST 1 00:30:06.083 #define SPDK_CONFIG_VIRTIO 1 00:30:06.083 #undef SPDK_CONFIG_VTUNE 00:30:06.083 #define SPDK_CONFIG_VTUNE_DIR 00:30:06.083 #define SPDK_CONFIG_WERROR 1 00:30:06.083 #define SPDK_CONFIG_WPDK_DIR 00:30:06.083 #undef SPDK_CONFIG_XNVME 00:30:06.083 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:06.083 10:37:42 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:06.083 10:37:42 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:30:06.083 10:37:42 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:06.083 10:37:42 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:06.083 10:37:42 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:06.083 10:37:42 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:06.083 10:37:42 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:06.083 10:37:42 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:30:06.083 10:37:42 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:06.083 10:37:42 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:06.083 10:37:42 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:30:06.083 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:06.084 10:37:42 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:30:06.084 
10:37:42 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:06.084 10:37:42 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:06.084 10:37:42 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 639007 ]] 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 639007 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.RRiC8S 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:06.084 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.RRiC8S/tests/interrupt /tmp/spdk.RRiC8S 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size 
use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=88758792192 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:06.085 10:37:42 reactor_set_interrupt 
-- common/autotest_common.sh@363 -- # uses["$mount"]=5749723136 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892275712 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9428992 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253438464 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:06.085 10:37:42 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=819200 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:06.085 * Looking for test storage... 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=88758792192 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:06.085 10:37:42 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7964315648 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.085 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:06.085 10:37:42 reactor_set_interrupt 
-- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:06.085 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:06.085 10:37:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:06.085 10:37:42 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:06.086 10:37:42 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:30:06.086 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:06.086 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:06.086 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=639070 00:30:06.086 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:06.086 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:06.086 10:37:42 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 639070 /var/tmp/spdk.sock 00:30:06.086 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 639070 ']' 00:30:06.086 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:06.086 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:06.086 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:06.086 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:06.086 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:06.086 10:37:42 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:06.086 [2024-07-15 10:37:43.010166] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:30:06.086 [2024-07-15 10:37:43.010232] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639070 ] 00:30:06.086 [2024-07-15 10:37:43.136779] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:06.086 [2024-07-15 10:37:43.247022] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:06.086 [2024-07-15 10:37:43.247109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:06.086 [2024-07-15 10:37:43.247114] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:06.344 [2024-07-15 10:37:43.328535] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:06.909 10:37:43 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:06.909 10:37:43 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:06.909 10:37:43 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:30:06.909 10:37:43 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:07.166 Malloc0 00:30:07.166 Malloc1 00:30:07.166 Malloc2 00:30:07.166 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:30:07.166 10:37:44 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:07.166 10:37:44 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:07.166 10:37:44 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:07.166 5000+0 records in 00:30:07.166 5000+0 records out 00:30:07.166 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0248007 s, 413 MB/s 00:30:07.166 10:37:44 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:07.423 AIO0 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 639070 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 639070 without_thd 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=639070 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:07.423 10:37:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:07.680 10:37:44 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:07.938 spdk_thread ids are 1 on reactor0. 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 639070 0 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639070 0 idle 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639070 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:07.938 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639070 -w 256 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639070 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.40 reactor_0' 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639070 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.40 reactor_0 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:08.196 10:37:45 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 639070 1 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639070 1 idle 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639070 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639070 -w 256 00:30:08.196 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639121 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 
00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639121 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 639070 2 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639070 2 idle 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639070 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:08.455 10:37:45 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 639070 -w 256 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639122 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2' 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639122 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:30:08.455 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:30:08.713 [2024-07-15 10:37:45.827945] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:08.713 10:37:45 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:09.279 [2024-07-15 10:37:46.331631] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:09.279 [2024-07-15 10:37:46.331915] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:09.279 10:37:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:09.536 [2024-07-15 10:37:46.591618] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:30:09.536 [2024-07-15 10:37:46.591749] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 639070 0 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 639070 0 busy 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639070 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:09.536 10:37:46 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 639070 -w 256 00:30:09.536 10:37:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639070 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0' 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639070 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.85 reactor_0 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 639070 2 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 639070 2 busy 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639070 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:09.792 10:37:46 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639070 -w 256 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639122 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2' 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639122 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.36 reactor_2 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:09.792 10:37:46 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:10.049 [2024-07-15 10:37:47.199616] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:30:10.049 [2024-07-15 10:37:47.199727] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 639070 2 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639070 2 idle 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639070 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639070 -w 256 00:30:10.049 10:37:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639122 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2' 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639122 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.60 reactor_2 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:10.306 10:37:47 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:10.306 10:37:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:10.564 [2024-07-15 10:37:47.627606] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:10.564 [2024-07-15 10:37:47.627723] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:10.564 10:37:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:30:10.564 10:37:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:30:10.564 10:37:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:30:10.822 [2024-07-15 10:37:47.872054] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:10.822 10:37:47 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 639070 0 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639070 0 idle 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639070 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:10.823 10:37:47 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639070 -w 256 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639070 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.71 reactor_0' 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639070 root 20 0 128.2g 36864 23616 S 6.7 0.0 0:01.71 reactor_0 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 
00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:30:11.081 10:37:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 639070 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 639070 ']' 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 639070 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 639070 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 639070' 00:30:11.081 killing process with pid 639070 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 639070 00:30:11.081 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 639070 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:11.339 10:37:48 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=639821 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:11.339 10:37:48 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 639821 /var/tmp/spdk.sock 00:30:11.339 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 639821 ']' 00:30:11.339 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:11.339 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:11.339 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:11.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:11.339 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:11.339 10:37:48 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:11.339 [2024-07-15 10:37:48.449838] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:30:11.339 [2024-07-15 10:37:48.449910] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid639821 ] 00:30:11.597 [2024-07-15 10:37:48.579503] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:11.597 [2024-07-15 10:37:48.685377] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:11.597 [2024-07-15 10:37:48.685461] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:11.597 [2024-07-15 10:37:48.685465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:11.597 [2024-07-15 10:37:48.759958] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:30:12.529 10:37:49 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:12.529 10:37:49 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:30:12.529 10:37:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:30:12.529 10:37:49 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:12.529 Malloc0 00:30:12.529 Malloc1 00:30:12.529 Malloc2 00:30:12.529 10:37:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:30:12.529 10:37:49 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:30:12.529 10:37:49 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:12.529 10:37:49 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:12.786 5000+0 records in 00:30:12.786 5000+0 records out 00:30:12.786 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0281464 s, 364 MB/s 
00:30:12.786 10:37:49 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:13.043 AIO0 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 639821 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 639821 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=639821 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:13.043 10:37:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:30:13.300 10:37:50 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:30:13.300 10:37:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:30:13.609 10:37:50 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:30:13.610 spdk_thread ids are 1 on reactor0. 
00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 639821 0 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639821 0 idle 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639821 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639821 -w 256 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639821 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.40 reactor_0' 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639821 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.40 reactor_0 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = 
\b\u\s\y ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 639821 1 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639821 1 idle 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639821 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639821 -w 256 00:30:13.610 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639833 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1' 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639833 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_1 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- 
# sed -e 's/^\s*//g' 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 639821 2 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639821 2 idle 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639821 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639821 -w 256 00:30:13.868 10:37:50 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:13.868 10:37:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639834 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2' 00:30:13.868 10:37:51 
reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639834 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.00 reactor_2 00:30:13.868 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:13.868 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:30:14.124 [2024-07-15 10:37:51.222145] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:30:14.124 [2024-07-15 10:37:51.222348] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:30:14.124 [2024-07-15 10:37:51.222527] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:14.124 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:30:14.380 [2024-07-15 10:37:51.462593] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:30:14.380 [2024-07-15 10:37:51.462777] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 639821 0 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 639821 0 busy 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639821 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639821 -w 256 00:30:14.380 10:37:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639821 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.83 reactor_0' 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639821 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.83 reactor_0 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:14.636 10:37:51 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 639821 2 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 639821 2 busy 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639821 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639821 -w 256 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639834 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2' 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639834 root 20 0 128.2g 36288 23040 R 99.9 0.0 0:00.36 reactor_2 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:14.636 10:37:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:14.892 10:37:51 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:30:14.892 10:37:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:30:14.892 10:37:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:30:14.892 10:37:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:30:14.892 10:37:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:30:14.892 10:37:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:14.892 10:37:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:30:14.892 [2024-07-15 10:37:52.064296] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:30:14.892 [2024-07-15 10:37:52.064390] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 639821 2 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639821 2 idle 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639821 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:30:14.892 10:37:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639821 -w 256 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639834 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2' 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639834 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:00.60 reactor_2 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:15.148 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:30:15.404 [2024-07-15 10:37:52.425219] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:30:15.404 [2024-07-15 10:37:52.425440] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:30:15.404 [2024-07-15 10:37:52.425462] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:30:15.404 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 639821 0 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 639821 0 idle 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=639821 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 639821 -w 256 00:30:15.405 10:37:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor=' 639821 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.60 reactor_0' 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 639821 root 20 0 128.2g 36288 23040 S 0.0 0.0 0:01.60 reactor_0 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:30:15.662 10:37:52 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:30:15.662 10:37:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 639821 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 639821 ']' 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 639821 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 639821 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 639821' 00:30:15.662 killing process with pid 639821 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 639821 00:30:15.662 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 639821 00:30:15.920 10:37:52 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:30:15.920 10:37:52 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:15.920 00:30:15.920 real 0m10.277s 00:30:15.920 user 0m9.441s 00:30:15.920 sys 0m2.347s 00:30:15.920 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:15.920 10:37:52 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:30:15.920 ************************************ 00:30:15.920 END TEST reactor_set_interrupt 00:30:15.920 ************************************ 00:30:15.920 10:37:53 -- common/autotest_common.sh@1142 -- # return 0 00:30:15.920 10:37:53 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:15.920 10:37:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:30:15.920 10:37:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:15.920 10:37:53 -- common/autotest_common.sh@10 -- # set +x 00:30:15.920 ************************************ 00:30:15.920 START TEST reap_unregistered_poller 00:30:15.920 ************************************ 00:30:15.920 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:16.193 * Looking for test storage... 
00:30:16.193 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.193 10:37:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.193 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.193 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:30:16.193 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:16.193 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:30:16.193 10:37:53 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:30:16.193 10:37:53 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:30:16.193 
10:37:53 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:30:16.193 10:37:53 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:30:16.193 10:37:53 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:30:16.193 #define SPDK_CONFIG_H 00:30:16.193 #define SPDK_CONFIG_APPS 1 00:30:16.193 #define SPDK_CONFIG_ARCH native 00:30:16.193 #undef SPDK_CONFIG_ASAN 00:30:16.193 #undef SPDK_CONFIG_AVAHI 00:30:16.193 #undef SPDK_CONFIG_CET 00:30:16.193 #define SPDK_CONFIG_COVERAGE 1 00:30:16.193 #define SPDK_CONFIG_CROSS_PREFIX 00:30:16.193 #define SPDK_CONFIG_CRYPTO 1 00:30:16.193 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:30:16.193 #undef SPDK_CONFIG_CUSTOMOCF 00:30:16.193 #undef SPDK_CONFIG_DAOS 00:30:16.193 #define SPDK_CONFIG_DAOS_DIR 00:30:16.193 #define SPDK_CONFIG_DEBUG 1 00:30:16.193 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:30:16.193 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:30:16.193 #define SPDK_CONFIG_DPDK_INC_DIR 00:30:16.193 #define SPDK_CONFIG_DPDK_LIB_DIR 00:30:16.193 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:30:16.193 #undef SPDK_CONFIG_DPDK_UADK 00:30:16.193 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:30:16.193 #define SPDK_CONFIG_EXAMPLES 1 00:30:16.193 #undef SPDK_CONFIG_FC 00:30:16.193 #define SPDK_CONFIG_FC_PATH 00:30:16.193 #define SPDK_CONFIG_FIO_PLUGIN 1 00:30:16.193 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:30:16.193 #undef SPDK_CONFIG_FUSE 00:30:16.193 #undef SPDK_CONFIG_FUZZER 00:30:16.193 #define SPDK_CONFIG_FUZZER_LIB 00:30:16.193 #undef SPDK_CONFIG_GOLANG 00:30:16.193 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:30:16.193 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:30:16.193 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:30:16.193 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:30:16.193 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:30:16.193 #undef SPDK_CONFIG_HAVE_LIBBSD 00:30:16.193 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:30:16.193 #define SPDK_CONFIG_IDXD 1 00:30:16.193 #define SPDK_CONFIG_IDXD_KERNEL 1 00:30:16.193 #define SPDK_CONFIG_IPSEC_MB 1 00:30:16.193 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:30:16.193 #define SPDK_CONFIG_ISAL 1 00:30:16.193 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:30:16.193 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:30:16.193 #define SPDK_CONFIG_LIBDIR 00:30:16.193 #undef SPDK_CONFIG_LTO 00:30:16.193 #define SPDK_CONFIG_MAX_LCORES 128 00:30:16.193 #define SPDK_CONFIG_NVME_CUSE 1 00:30:16.193 #undef SPDK_CONFIG_OCF 00:30:16.193 #define SPDK_CONFIG_OCF_PATH 00:30:16.193 #define SPDK_CONFIG_OPENSSL_PATH 00:30:16.193 #undef SPDK_CONFIG_PGO_CAPTURE 00:30:16.193 #define SPDK_CONFIG_PGO_DIR 00:30:16.193 #undef SPDK_CONFIG_PGO_USE 00:30:16.193 #define SPDK_CONFIG_PREFIX /usr/local 00:30:16.193 #undef SPDK_CONFIG_RAID5F 00:30:16.193 #undef SPDK_CONFIG_RBD 00:30:16.193 #define SPDK_CONFIG_RDMA 1 00:30:16.193 #define SPDK_CONFIG_RDMA_PROV verbs 00:30:16.193 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:30:16.193 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:30:16.193 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:30:16.193 #define 
SPDK_CONFIG_SHARED 1 00:30:16.193 #undef SPDK_CONFIG_SMA 00:30:16.193 #define SPDK_CONFIG_TESTS 1 00:30:16.193 #undef SPDK_CONFIG_TSAN 00:30:16.193 #define SPDK_CONFIG_UBLK 1 00:30:16.193 #define SPDK_CONFIG_UBSAN 1 00:30:16.193 #undef SPDK_CONFIG_UNIT_TESTS 00:30:16.193 #undef SPDK_CONFIG_URING 00:30:16.193 #define SPDK_CONFIG_URING_PATH 00:30:16.193 #undef SPDK_CONFIG_URING_ZNS 00:30:16.193 #undef SPDK_CONFIG_USDT 00:30:16.193 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:30:16.193 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:30:16.193 #undef SPDK_CONFIG_VFIO_USER 00:30:16.193 #define SPDK_CONFIG_VFIO_USER_DIR 00:30:16.193 #define SPDK_CONFIG_VHOST 1 00:30:16.193 #define SPDK_CONFIG_VIRTIO 1 00:30:16.193 #undef SPDK_CONFIG_VTUNE 00:30:16.193 #define SPDK_CONFIG_VTUNE_DIR 00:30:16.193 #define SPDK_CONFIG_WERROR 1 00:30:16.193 #define SPDK_CONFIG_WPDK_DIR 00:30:16.193 #undef SPDK_CONFIG_XNVME 00:30:16.193 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:30:16.193 10:37:53 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:30:16.193 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:16.193 10:37:53 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:16.193 10:37:53 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:16.193 10:37:53 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:16.193 10:37:53 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:16.194 10:37:53 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:16.194 10:37:53 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:30:16.194 10:37:53 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:30:16.194 10:37:53 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:30:16.194 10:37:53 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:30:16.194 10:37:53 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:30:16.194 10:37:53 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:30:16.194 10:37:53 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:30:16.194 10:37:53 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:30:16.194 10:37:53 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:30:16.194 10:37:53 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:16.194 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:30:16.195 10:37:53 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 640619 ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 640619 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.oMWcjS 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.oMWcjS/tests/interrupt /tmp/spdk.oMWcjS 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:30:16.195 10:37:53 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=946290688 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4338139136 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=88758636544 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508515328 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5749878784 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249547264 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892275712 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901704704 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9428992 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253438464 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254257664 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=819200 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450844160 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450848256 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:30:16.195 * Looking for test storage... 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=88758636544 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:30:16.195 10:37:53 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7964471296 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.195 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=640663 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:16.195 10:37:53 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 640663 /var/tmp/spdk.sock 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 640663 ']' 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:16.195 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:16.196 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:16.196 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:16.196 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:16.196 10:37:53 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:16.196 [2024-07-15 10:37:53.334666] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:30:16.196 [2024-07-15 10:37:53.334724] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid640663 ] 00:30:16.455 [2024-07-15 10:37:53.446912] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:16.455 [2024-07-15 10:37:53.552547] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:16.455 [2024-07-15 10:37:53.552633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:16.455 [2024-07-15 10:37:53.552637] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:16.455 [2024-07-15 10:37:53.623851] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:30:17.387 10:37:54 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:17.387 10:37:54 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:30:17.387 10:37:54 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:17.387 10:37:54 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:17.387 10:37:54 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:30:17.387 "name": "app_thread", 00:30:17.387 "id": 1, 00:30:17.387 "active_pollers": [], 00:30:17.387 "timed_pollers": [ 00:30:17.387 { 00:30:17.387 "name": "rpc_subsystem_poll_servers", 00:30:17.387 "id": 1, 00:30:17.387 "state": "waiting", 00:30:17.387 "run_count": 0, 00:30:17.387 "busy_count": 0, 00:30:17.387 "period_ticks": 9200000 00:30:17.387 } 00:30:17.387 ], 00:30:17.387 "paused_pollers": [] 00:30:17.387 }' 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:30:17.387 
10:37:54 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:30:17.387 10:37:54 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:30:17.387 5000+0 records in 00:30:17.387 5000+0 records out 00:30:17.387 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0270988 s, 378 MB/s 00:30:17.388 10:37:54 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:30:17.645 AIO0 00:30:17.645 10:37:54 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:17.903 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:30:18.162 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:18.162 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:18.162 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:30:18.162 "name": "app_thread", 00:30:18.162 "id": 1, 00:30:18.162 "active_pollers": [], 00:30:18.162 "timed_pollers": [ 00:30:18.162 { 00:30:18.162 "name": "rpc_subsystem_poll_servers", 00:30:18.162 "id": 1, 00:30:18.162 "state": "waiting", 00:30:18.162 "run_count": 0, 00:30:18.162 "busy_count": 0, 
00:30:18.162 "period_ticks": 9200000 00:30:18.162 } 00:30:18.162 ], 00:30:18.162 "paused_pollers": [] 00:30:18.162 }' 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:30:18.162 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 640663 00:30:18.162 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 640663 ']' 00:30:18.162 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 640663 00:30:18.162 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:30:18.163 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:18.163 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 640663 00:30:18.163 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:18.163 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:18.163 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 
'killing process with pid 640663' 00:30:18.163 killing process with pid 640663 00:30:18.163 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 640663 00:30:18.163 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 640663 00:30:18.420 10:37:55 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:30:18.420 10:37:55 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:30:18.420 00:30:18.420 real 0m2.498s 00:30:18.420 user 0m1.594s 00:30:18.420 sys 0m0.629s 00:30:18.420 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:18.420 10:37:55 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:30:18.420 ************************************ 00:30:18.420 END TEST reap_unregistered_poller 00:30:18.420 ************************************ 00:30:18.420 10:37:55 -- common/autotest_common.sh@1142 -- # return 0 00:30:18.420 10:37:55 -- spdk/autotest.sh@198 -- # uname -s 00:30:18.420 10:37:55 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:30:18.420 10:37:55 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:30:18.420 10:37:55 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:30:18.420 10:37:55 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@260 -- # timing_exit lib 00:30:18.420 10:37:55 -- common/autotest_common.sh@728 -- # xtrace_disable 00:30:18.420 10:37:55 -- common/autotest_common.sh@10 -- # set +x 00:30:18.420 10:37:55 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- 
spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:30:18.420 10:37:55 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:18.420 10:37:55 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:18.420 10:37:55 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:18.420 10:37:55 -- common/autotest_common.sh@10 -- # set +x 00:30:18.679 ************************************ 00:30:18.679 START TEST compress_compdev 00:30:18.679 ************************************ 00:30:18.679 10:37:55 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:30:18.679 * Looking for test storage... 
00:30:18.679 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:30:18.679 10:37:55 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:30:18.679 10:37:55 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:30:18.679 10:37:55 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:30:18.679 10:37:55 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:18.679 10:37:55 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:18.679 10:37:55 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:18.679 10:37:55 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:30:18.679 10:37:55 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:30:18.679 10:37:55 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=641100 00:30:18.679 10:37:55 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 641100 00:30:18.679 10:37:55 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 641100 ']' 00:30:18.679 10:37:55 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:30:18.679 10:37:55 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:18.679 10:37:55 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:18.679 10:37:55 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:18.679 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:18.679 10:37:55 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:18.679 10:37:55 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:18.679 [2024-07-15 10:37:55.858989] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:30:18.679 [2024-07-15 10:37:55.859060] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid641100 ] 00:30:18.937 [2024-07-15 10:37:55.981689] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:18.937 [2024-07-15 10:37:56.082938] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:18.937 [2024-07-15 10:37:56.082962] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:19.871 [2024-07-15 10:37:56.822623] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:19.871 10:37:56 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:19.871 10:37:56 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:19.871 10:37:56 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:30:19.871 10:37:56 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:19.871 10:37:56 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:20.436 [2024-07-15 10:37:57.407174] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22653c0 PMD being used: compress_qat 00:30:20.436 10:37:57 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:20.436 10:37:57 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:20.436 10:37:57 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:20.436 10:37:57 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:20.436 10:37:57 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:20.436 10:37:57 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:20.436 10:37:57 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:20.436 10:37:57 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:20.694 [ 00:30:20.694 { 00:30:20.694 "name": "Nvme0n1", 00:30:20.694 "aliases": [ 00:30:20.694 "01000000-0000-0000-5cd2-e43197705251" 00:30:20.694 ], 00:30:20.694 "product_name": "NVMe disk", 00:30:20.694 "block_size": 512, 00:30:20.694 "num_blocks": 15002931888, 00:30:20.694 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:20.694 "assigned_rate_limits": { 00:30:20.694 "rw_ios_per_sec": 0, 00:30:20.694 "rw_mbytes_per_sec": 0, 00:30:20.694 "r_mbytes_per_sec": 0, 00:30:20.694 "w_mbytes_per_sec": 0 00:30:20.694 }, 00:30:20.694 "claimed": false, 00:30:20.694 "zoned": false, 00:30:20.694 "supported_io_types": { 00:30:20.694 "read": true, 00:30:20.694 "write": true, 00:30:20.694 "unmap": true, 00:30:20.694 "flush": true, 00:30:20.694 "reset": true, 00:30:20.694 "nvme_admin": true, 00:30:20.694 "nvme_io": true, 00:30:20.694 "nvme_io_md": false, 00:30:20.694 "write_zeroes": true, 00:30:20.694 "zcopy": false, 00:30:20.694 "get_zone_info": false, 00:30:20.694 "zone_management": false, 00:30:20.694 "zone_append": false, 00:30:20.694 "compare": false, 00:30:20.694 "compare_and_write": false, 00:30:20.694 "abort": true, 00:30:20.694 "seek_hole": false, 00:30:20.694 "seek_data": false, 00:30:20.694 "copy": false, 00:30:20.694 "nvme_iov_md": false 00:30:20.694 }, 00:30:20.694 "driver_specific": { 00:30:20.694 "nvme": [ 00:30:20.694 { 00:30:20.694 "pci_address": "0000:5e:00.0", 00:30:20.694 "trid": { 00:30:20.694 "trtype": "PCIe", 00:30:20.694 "traddr": "0000:5e:00.0" 00:30:20.694 }, 00:30:20.694 "ctrlr_data": { 00:30:20.694 "cntlid": 0, 00:30:20.694 "vendor_id": "0x8086", 00:30:20.694 "model_number": "INTEL SSDPF2KX076TZO", 00:30:20.694 
"serial_number": "PHAC0301002G7P6CGN", 00:30:20.694 "firmware_revision": "JCV10200", 00:30:20.694 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:20.694 "oacs": { 00:30:20.694 "security": 1, 00:30:20.694 "format": 1, 00:30:20.694 "firmware": 1, 00:30:20.694 "ns_manage": 1 00:30:20.694 }, 00:30:20.694 "multi_ctrlr": false, 00:30:20.694 "ana_reporting": false 00:30:20.694 }, 00:30:20.694 "vs": { 00:30:20.694 "nvme_version": "1.3" 00:30:20.694 }, 00:30:20.694 "ns_data": { 00:30:20.694 "id": 1, 00:30:20.694 "can_share": false 00:30:20.694 }, 00:30:20.694 "security": { 00:30:20.694 "opal": true 00:30:20.694 } 00:30:20.694 } 00:30:20.694 ], 00:30:20.694 "mp_policy": "active_passive" 00:30:20.694 } 00:30:20.694 } 00:30:20.694 ] 00:30:20.694 10:37:57 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:20.694 10:37:57 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:20.951 [2024-07-15 10:37:57.952262] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20ca0d0 PMD being used: compress_qat 00:30:23.479 44f302b3-a44c-4d50-86eb-5d36214c4a45 00:30:23.479 10:38:00 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:23.479 13531e6d-532e-4721-b77e-2406f233d428 00:30:23.479 10:38:00 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:23.479 10:38:00 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:23.479 10:38:00 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:23.479 10:38:00 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:23.479 10:38:00 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:23.479 10:38:00 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:30:23.479 10:38:00 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:23.479 10:38:00 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:23.479 [ 00:30:23.479 { 00:30:23.479 "name": "13531e6d-532e-4721-b77e-2406f233d428", 00:30:23.479 "aliases": [ 00:30:23.479 "lvs0/lv0" 00:30:23.479 ], 00:30:23.479 "product_name": "Logical Volume", 00:30:23.479 "block_size": 512, 00:30:23.479 "num_blocks": 204800, 00:30:23.479 "uuid": "13531e6d-532e-4721-b77e-2406f233d428", 00:30:23.479 "assigned_rate_limits": { 00:30:23.479 "rw_ios_per_sec": 0, 00:30:23.479 "rw_mbytes_per_sec": 0, 00:30:23.479 "r_mbytes_per_sec": 0, 00:30:23.479 "w_mbytes_per_sec": 0 00:30:23.479 }, 00:30:23.479 "claimed": false, 00:30:23.479 "zoned": false, 00:30:23.479 "supported_io_types": { 00:30:23.479 "read": true, 00:30:23.479 "write": true, 00:30:23.479 "unmap": true, 00:30:23.479 "flush": false, 00:30:23.479 "reset": true, 00:30:23.479 "nvme_admin": false, 00:30:23.479 "nvme_io": false, 00:30:23.479 "nvme_io_md": false, 00:30:23.479 "write_zeroes": true, 00:30:23.479 "zcopy": false, 00:30:23.479 "get_zone_info": false, 00:30:23.479 "zone_management": false, 00:30:23.479 "zone_append": false, 00:30:23.479 "compare": false, 00:30:23.479 "compare_and_write": false, 00:30:23.479 "abort": false, 00:30:23.479 "seek_hole": true, 00:30:23.479 "seek_data": true, 00:30:23.479 "copy": false, 00:30:23.479 "nvme_iov_md": false 00:30:23.479 }, 00:30:23.479 "driver_specific": { 00:30:23.479 "lvol": { 00:30:23.479 "lvol_store_uuid": "44f302b3-a44c-4d50-86eb-5d36214c4a45", 00:30:23.479 "base_bdev": "Nvme0n1", 00:30:23.479 "thin_provision": true, 00:30:23.479 "num_allocated_clusters": 0, 00:30:23.479 "snapshot": false, 00:30:23.479 "clone": false, 00:30:23.479 "esnap_clone": false 00:30:23.479 } 00:30:23.480 } 
00:30:23.480 } 00:30:23.480 ] 00:30:23.480 10:38:00 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:23.480 10:38:00 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:23.480 10:38:00 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:23.737 [2024-07-15 10:38:00.826158] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:23.737 COMP_lvs0/lv0 00:30:23.737 10:38:00 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:23.737 10:38:00 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:23.737 10:38:00 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:23.737 10:38:00 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:23.738 10:38:00 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:23.738 10:38:00 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:23.738 10:38:00 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:23.996 10:38:01 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:24.254 [ 00:30:24.254 { 00:30:24.254 "name": "COMP_lvs0/lv0", 00:30:24.254 "aliases": [ 00:30:24.254 "59edc3c7-2555-53d3-b162-ff6169382a8b" 00:30:24.254 ], 00:30:24.254 "product_name": "compress", 00:30:24.254 "block_size": 512, 00:30:24.254 "num_blocks": 200704, 00:30:24.254 "uuid": "59edc3c7-2555-53d3-b162-ff6169382a8b", 00:30:24.254 "assigned_rate_limits": { 00:30:24.255 "rw_ios_per_sec": 0, 00:30:24.255 "rw_mbytes_per_sec": 0, 00:30:24.255 "r_mbytes_per_sec": 0, 00:30:24.255 "w_mbytes_per_sec": 0 00:30:24.255 
}, 00:30:24.255 "claimed": false, 00:30:24.255 "zoned": false, 00:30:24.255 "supported_io_types": { 00:30:24.255 "read": true, 00:30:24.255 "write": true, 00:30:24.255 "unmap": false, 00:30:24.255 "flush": false, 00:30:24.255 "reset": false, 00:30:24.255 "nvme_admin": false, 00:30:24.255 "nvme_io": false, 00:30:24.255 "nvme_io_md": false, 00:30:24.255 "write_zeroes": true, 00:30:24.255 "zcopy": false, 00:30:24.255 "get_zone_info": false, 00:30:24.255 "zone_management": false, 00:30:24.255 "zone_append": false, 00:30:24.255 "compare": false, 00:30:24.255 "compare_and_write": false, 00:30:24.255 "abort": false, 00:30:24.255 "seek_hole": false, 00:30:24.255 "seek_data": false, 00:30:24.255 "copy": false, 00:30:24.255 "nvme_iov_md": false 00:30:24.255 }, 00:30:24.255 "driver_specific": { 00:30:24.255 "compress": { 00:30:24.255 "name": "COMP_lvs0/lv0", 00:30:24.255 "base_bdev_name": "13531e6d-532e-4721-b77e-2406f233d428" 00:30:24.255 } 00:30:24.255 } 00:30:24.255 } 00:30:24.255 ] 00:30:24.255 10:38:01 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:24.255 10:38:01 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:24.255 [2024-07-15 10:38:01.368259] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f444c1b15c0 PMD being used: compress_qat 00:30:24.255 [2024-07-15 10:38:01.370451] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2262670 PMD being used: compress_qat 00:30:24.255 Running I/O for 3 seconds... 
00:30:27.537
00:30:27.537 Latency(us)
00:30:27.537 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:27.537 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:30:27.537 Verification LBA range: start 0x0 length 0x3100
00:30:27.537 COMP_lvs0/lv0 : 3.00 5147.28 20.11 0.00 0.00 6165.71 584.13 5556.31
00:30:27.537 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:30:27.537 Verification LBA range: start 0x3100 length 0x3100
00:30:27.537 COMP_lvs0/lv0 : 3.00 5424.23 21.19 0.00 0.00 5863.24 416.72 5613.30
00:30:27.537 ===================================================================================================================
00:30:27.537 Total : 10571.51 41.29 0.00 0.00 6010.51 416.72 5613.30
00:30:27.537 0
00:30:27.537 10:38:04 compress_compdev -- compress/compress.sh@76 -- # destroy_vols
00:30:27.537 10:38:04 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:30:27.538 10:38:04 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:30:27.796 10:38:04 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT
00:30:27.796 10:38:04 compress_compdev -- compress/compress.sh@78 -- # killprocess 641100
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 641100 ']'
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 641100
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@953 -- # uname
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 641100
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 641100'
killing process with pid 641100
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@967 -- # kill 641100
00:30:27.796 Received shutdown signal, test time was about 3.000000 seconds
00:30:27.796
00:30:27.796 Latency(us)
00:30:27.796 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:27.796 ===================================================================================================================
00:30:27.796 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:30:27.796 10:38:04 compress_compdev -- common/autotest_common.sh@972 -- # wait 641100
00:30:31.079 10:38:07 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512
00:30:31.079 10:38:07 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]]
00:30:31.079 10:38:07 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=642536
00:30:31.079 10:38:07 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT
00:30:31.079 10:38:07 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json
00:30:31.079 10:38:07 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 642536
00:30:31.079 10:38:07 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 642536 ']'
00:30:31.079 10:38:07 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:30:31.079 10:38:07 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100
00:30:31.079 10:38:07 compress_compdev -- common/autotest_common.sh@836 --
echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:31.079 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:31.079 10:38:07 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:31.079 10:38:07 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:31.079 [2024-07-15 10:38:07.886963] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:30:31.079 [2024-07-15 10:38:07.887020] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid642536 ] 00:30:31.079 [2024-07-15 10:38:07.989756] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:31.079 [2024-07-15 10:38:08.089398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:31.079 [2024-07-15 10:38:08.089404] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:32.013 [2024-07-15 10:38:08.855865] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:32.013 10:38:08 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:32.013 10:38:08 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:32.013 10:38:08 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:30:32.013 10:38:08 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:32.013 10:38:08 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:32.270 [2024-07-15 10:38:09.430830] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10bd3c0 PMD being used: compress_qat 00:30:32.270 10:38:09 compress_compdev -- compress/compress.sh@35 -- 
# waitforbdev Nvme0n1 00:30:32.270 10:38:09 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:32.270 10:38:09 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:32.270 10:38:09 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:32.270 10:38:09 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:32.270 10:38:09 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:32.270 10:38:09 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:32.528 10:38:09 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:32.786 [ 00:30:32.786 { 00:30:32.786 "name": "Nvme0n1", 00:30:32.786 "aliases": [ 00:30:32.786 "01000000-0000-0000-5cd2-e43197705251" 00:30:32.786 ], 00:30:32.786 "product_name": "NVMe disk", 00:30:32.786 "block_size": 512, 00:30:32.786 "num_blocks": 15002931888, 00:30:32.786 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:32.786 "assigned_rate_limits": { 00:30:32.786 "rw_ios_per_sec": 0, 00:30:32.786 "rw_mbytes_per_sec": 0, 00:30:32.786 "r_mbytes_per_sec": 0, 00:30:32.786 "w_mbytes_per_sec": 0 00:30:32.786 }, 00:30:32.786 "claimed": false, 00:30:32.786 "zoned": false, 00:30:32.786 "supported_io_types": { 00:30:32.786 "read": true, 00:30:32.786 "write": true, 00:30:32.786 "unmap": true, 00:30:32.786 "flush": true, 00:30:32.786 "reset": true, 00:30:32.786 "nvme_admin": true, 00:30:32.786 "nvme_io": true, 00:30:32.786 "nvme_io_md": false, 00:30:32.786 "write_zeroes": true, 00:30:32.786 "zcopy": false, 00:30:32.786 "get_zone_info": false, 00:30:32.786 "zone_management": false, 00:30:32.786 "zone_append": false, 00:30:32.786 "compare": false, 00:30:32.786 "compare_and_write": false, 00:30:32.786 "abort": true, 00:30:32.786 "seek_hole": false, 
00:30:32.786 "seek_data": false, 00:30:32.786 "copy": false, 00:30:32.786 "nvme_iov_md": false 00:30:32.786 }, 00:30:32.786 "driver_specific": { 00:30:32.786 "nvme": [ 00:30:32.786 { 00:30:32.786 "pci_address": "0000:5e:00.0", 00:30:32.786 "trid": { 00:30:32.786 "trtype": "PCIe", 00:30:32.786 "traddr": "0000:5e:00.0" 00:30:32.786 }, 00:30:32.786 "ctrlr_data": { 00:30:32.786 "cntlid": 0, 00:30:32.786 "vendor_id": "0x8086", 00:30:32.786 "model_number": "INTEL SSDPF2KX076TZO", 00:30:32.786 "serial_number": "PHAC0301002G7P6CGN", 00:30:32.786 "firmware_revision": "JCV10200", 00:30:32.786 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:32.786 "oacs": { 00:30:32.786 "security": 1, 00:30:32.786 "format": 1, 00:30:32.786 "firmware": 1, 00:30:32.786 "ns_manage": 1 00:30:32.786 }, 00:30:32.786 "multi_ctrlr": false, 00:30:32.786 "ana_reporting": false 00:30:32.786 }, 00:30:32.786 "vs": { 00:30:32.786 "nvme_version": "1.3" 00:30:32.786 }, 00:30:32.786 "ns_data": { 00:30:32.786 "id": 1, 00:30:32.786 "can_share": false 00:30:32.786 }, 00:30:32.786 "security": { 00:30:32.786 "opal": true 00:30:32.786 } 00:30:32.786 } 00:30:32.786 ], 00:30:32.786 "mp_policy": "active_passive" 00:30:32.786 } 00:30:32.786 } 00:30:32.786 ] 00:30:32.786 10:38:09 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:32.786 10:38:09 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:33.044 [2024-07-15 10:38:10.172536] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf22660 PMD being used: compress_qat 00:30:35.608 14b028f2-140c-4f93-ae5d-8e8a95421be6 00:30:35.608 10:38:12 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:35.608 437fb2e0-7bee-428a-a297-7146ce84262e 00:30:35.608 10:38:12 compress_compdev -- compress/compress.sh@39 -- # 
waitforbdev lvs0/lv0 00:30:35.608 10:38:12 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:35.608 10:38:12 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:35.608 10:38:12 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:35.608 10:38:12 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:35.608 10:38:12 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:35.608 10:38:12 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:35.866 10:38:12 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:35.866 [ 00:30:35.866 { 00:30:35.866 "name": "437fb2e0-7bee-428a-a297-7146ce84262e", 00:30:35.866 "aliases": [ 00:30:35.866 "lvs0/lv0" 00:30:35.866 ], 00:30:35.866 "product_name": "Logical Volume", 00:30:35.866 "block_size": 512, 00:30:35.866 "num_blocks": 204800, 00:30:35.866 "uuid": "437fb2e0-7bee-428a-a297-7146ce84262e", 00:30:35.866 "assigned_rate_limits": { 00:30:35.866 "rw_ios_per_sec": 0, 00:30:35.866 "rw_mbytes_per_sec": 0, 00:30:35.866 "r_mbytes_per_sec": 0, 00:30:35.866 "w_mbytes_per_sec": 0 00:30:35.866 }, 00:30:35.866 "claimed": false, 00:30:35.866 "zoned": false, 00:30:35.866 "supported_io_types": { 00:30:35.866 "read": true, 00:30:35.866 "write": true, 00:30:35.866 "unmap": true, 00:30:35.866 "flush": false, 00:30:35.866 "reset": true, 00:30:35.866 "nvme_admin": false, 00:30:35.866 "nvme_io": false, 00:30:35.866 "nvme_io_md": false, 00:30:35.866 "write_zeroes": true, 00:30:35.866 "zcopy": false, 00:30:35.866 "get_zone_info": false, 00:30:35.866 "zone_management": false, 00:30:35.866 "zone_append": false, 00:30:35.866 "compare": false, 00:30:35.866 "compare_and_write": false, 00:30:35.866 "abort": false, 00:30:35.866 "seek_hole": true, 
00:30:35.866 "seek_data": true, 00:30:35.866 "copy": false, 00:30:35.866 "nvme_iov_md": false 00:30:35.866 }, 00:30:35.866 "driver_specific": { 00:30:35.866 "lvol": { 00:30:35.866 "lvol_store_uuid": "14b028f2-140c-4f93-ae5d-8e8a95421be6", 00:30:35.866 "base_bdev": "Nvme0n1", 00:30:35.866 "thin_provision": true, 00:30:35.866 "num_allocated_clusters": 0, 00:30:35.866 "snapshot": false, 00:30:35.866 "clone": false, 00:30:35.866 "esnap_clone": false 00:30:35.866 } 00:30:35.866 } 00:30:35.866 } 00:30:35.866 ] 00:30:36.130 10:38:13 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:36.130 10:38:13 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:36.130 10:38:13 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:36.130 [2024-07-15 10:38:13.246752] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:36.130 COMP_lvs0/lv0 00:30:36.130 10:38:13 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:36.130 10:38:13 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:36.130 10:38:13 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:36.130 10:38:13 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:36.130 10:38:13 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:36.130 10:38:13 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:36.130 10:38:13 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:36.387 10:38:13 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:36.644 [ 00:30:36.644 { 
00:30:36.644 "name": "COMP_lvs0/lv0", 00:30:36.644 "aliases": [ 00:30:36.644 "e4ad78cf-8641-51b3-8c0d-f76dddd6d026" 00:30:36.644 ], 00:30:36.644 "product_name": "compress", 00:30:36.644 "block_size": 512, 00:30:36.644 "num_blocks": 200704, 00:30:36.644 "uuid": "e4ad78cf-8641-51b3-8c0d-f76dddd6d026", 00:30:36.644 "assigned_rate_limits": { 00:30:36.644 "rw_ios_per_sec": 0, 00:30:36.644 "rw_mbytes_per_sec": 0, 00:30:36.644 "r_mbytes_per_sec": 0, 00:30:36.644 "w_mbytes_per_sec": 0 00:30:36.644 }, 00:30:36.644 "claimed": false, 00:30:36.644 "zoned": false, 00:30:36.644 "supported_io_types": { 00:30:36.644 "read": true, 00:30:36.644 "write": true, 00:30:36.644 "unmap": false, 00:30:36.644 "flush": false, 00:30:36.644 "reset": false, 00:30:36.644 "nvme_admin": false, 00:30:36.644 "nvme_io": false, 00:30:36.644 "nvme_io_md": false, 00:30:36.644 "write_zeroes": true, 00:30:36.644 "zcopy": false, 00:30:36.644 "get_zone_info": false, 00:30:36.644 "zone_management": false, 00:30:36.644 "zone_append": false, 00:30:36.644 "compare": false, 00:30:36.644 "compare_and_write": false, 00:30:36.644 "abort": false, 00:30:36.644 "seek_hole": false, 00:30:36.644 "seek_data": false, 00:30:36.644 "copy": false, 00:30:36.644 "nvme_iov_md": false 00:30:36.644 }, 00:30:36.644 "driver_specific": { 00:30:36.644 "compress": { 00:30:36.644 "name": "COMP_lvs0/lv0", 00:30:36.644 "base_bdev_name": "437fb2e0-7bee-428a-a297-7146ce84262e" 00:30:36.644 } 00:30:36.644 } 00:30:36.644 } 00:30:36.644 ] 00:30:36.644 10:38:13 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:36.644 10:38:13 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:36.644 [2024-07-15 10:38:13.760760] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7efdf41b15c0 PMD being used: compress_qat 00:30:36.644 [2024-07-15 10:38:13.762996] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10ba770 PMD 
being used: compress_qat
00:30:36.644 Running I/O for 3 seconds...
00:30:39.921
00:30:39.921 Latency(us)
00:30:39.921 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:39.921 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096)
00:30:39.921 Verification LBA range: start 0x0 length 0x3100
00:30:39.921 COMP_lvs0/lv0 : 3.00 5145.47 20.10 0.00 0.00 6168.38 527.14 5784.26
00:30:39.921 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096)
00:30:39.921 Verification LBA range: start 0x3100 length 0x3100
00:30:39.921 COMP_lvs0/lv0 : 3.00 5420.10 21.17 0.00 0.00 5868.06 443.44 5499.33
00:30:39.921 ===================================================================================================================
00:30:39.921 Total : 10565.57 41.27 0.00 0.00 6014.31 443.44 5784.26
00:30:39.921 0
00:30:39.921 10:38:16 compress_compdev -- compress/compress.sh@76 -- # destroy_vols
00:30:39.921 10:38:16 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0
00:30:39.921 10:38:16 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0
00:30:40.179 10:38:17 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT
00:30:40.179 10:38:17 compress_compdev -- compress/compress.sh@78 -- # killprocess 642536
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 642536 ']'
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 642536
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@953 -- # uname
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 642536
00:30:40.179 10:38:17 compress_compdev
-- common/autotest_common.sh@954 -- # process_name=reactor_1
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 642536'
killing process with pid 642536
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@967 -- # kill 642536
00:30:40.179 Received shutdown signal, test time was about 3.000000 seconds
00:30:40.179
00:30:40.179 Latency(us)
00:30:40.179 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:30:40.179 ===================================================================================================================
00:30:40.179 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:30:40.179 10:38:17 compress_compdev -- common/autotest_common.sh@972 -- # wait 642536
00:30:43.459 10:38:20 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096
00:30:43.459 10:38:20 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]]
00:30:43.459 10:38:20 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=644135
00:30:43.459 10:38:20 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT
00:30:43.459 10:38:20 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json
00:30:43.460 10:38:20 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 644135
00:30:43.460 10:38:20 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 644135 ']'
00:30:43.460 10:38:20 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:30:43.460 10:38:20 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100
00:30:43.460
10:38:20 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:43.460 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:43.460 10:38:20 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:43.460 10:38:20 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:43.460 [2024-07-15 10:38:20.371060] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:30:43.460 [2024-07-15 10:38:20.371131] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid644135 ] 00:30:43.460 [2024-07-15 10:38:20.490387] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:43.460 [2024-07-15 10:38:20.597921] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:43.460 [2024-07-15 10:38:20.597936] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:44.394 [2024-07-15 10:38:21.363187] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:44.394 10:38:21 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:44.394 10:38:21 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:44.394 10:38:21 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:30:44.394 10:38:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:44.394 10:38:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:44.961 [2024-07-15 10:38:22.006914] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x297c3c0 PMD being used: compress_qat 
00:30:44.961 10:38:22 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:44.961 10:38:22 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:44.961 10:38:22 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:44.961 10:38:22 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:44.961 10:38:22 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:44.961 10:38:22 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:44.961 10:38:22 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:45.219 10:38:22 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:45.478 [ 00:30:45.478 { 00:30:45.478 "name": "Nvme0n1", 00:30:45.478 "aliases": [ 00:30:45.478 "01000000-0000-0000-5cd2-e43197705251" 00:30:45.478 ], 00:30:45.478 "product_name": "NVMe disk", 00:30:45.478 "block_size": 512, 00:30:45.478 "num_blocks": 15002931888, 00:30:45.478 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:45.478 "assigned_rate_limits": { 00:30:45.478 "rw_ios_per_sec": 0, 00:30:45.478 "rw_mbytes_per_sec": 0, 00:30:45.478 "r_mbytes_per_sec": 0, 00:30:45.478 "w_mbytes_per_sec": 0 00:30:45.478 }, 00:30:45.478 "claimed": false, 00:30:45.478 "zoned": false, 00:30:45.478 "supported_io_types": { 00:30:45.478 "read": true, 00:30:45.478 "write": true, 00:30:45.478 "unmap": true, 00:30:45.478 "flush": true, 00:30:45.478 "reset": true, 00:30:45.478 "nvme_admin": true, 00:30:45.478 "nvme_io": true, 00:30:45.478 "nvme_io_md": false, 00:30:45.478 "write_zeroes": true, 00:30:45.478 "zcopy": false, 00:30:45.478 "get_zone_info": false, 00:30:45.478 "zone_management": false, 00:30:45.478 "zone_append": false, 00:30:45.478 "compare": false, 00:30:45.478 "compare_and_write": 
false, 00:30:45.478 "abort": true, 00:30:45.478 "seek_hole": false, 00:30:45.478 "seek_data": false, 00:30:45.478 "copy": false, 00:30:45.478 "nvme_iov_md": false 00:30:45.478 }, 00:30:45.478 "driver_specific": { 00:30:45.478 "nvme": [ 00:30:45.478 { 00:30:45.478 "pci_address": "0000:5e:00.0", 00:30:45.478 "trid": { 00:30:45.478 "trtype": "PCIe", 00:30:45.478 "traddr": "0000:5e:00.0" 00:30:45.478 }, 00:30:45.478 "ctrlr_data": { 00:30:45.478 "cntlid": 0, 00:30:45.478 "vendor_id": "0x8086", 00:30:45.478 "model_number": "INTEL SSDPF2KX076TZO", 00:30:45.478 "serial_number": "PHAC0301002G7P6CGN", 00:30:45.478 "firmware_revision": "JCV10200", 00:30:45.478 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:45.478 "oacs": { 00:30:45.478 "security": 1, 00:30:45.478 "format": 1, 00:30:45.478 "firmware": 1, 00:30:45.478 "ns_manage": 1 00:30:45.478 }, 00:30:45.478 "multi_ctrlr": false, 00:30:45.478 "ana_reporting": false 00:30:45.478 }, 00:30:45.478 "vs": { 00:30:45.478 "nvme_version": "1.3" 00:30:45.478 }, 00:30:45.478 "ns_data": { 00:30:45.478 "id": 1, 00:30:45.478 "can_share": false 00:30:45.478 }, 00:30:45.478 "security": { 00:30:45.478 "opal": true 00:30:45.478 } 00:30:45.478 } 00:30:45.478 ], 00:30:45.478 "mp_policy": "active_passive" 00:30:45.478 } 00:30:45.478 } 00:30:45.478 ] 00:30:45.478 10:38:22 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:45.478 10:38:22 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:45.737 [2024-07-15 10:38:22.772633] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x27e10d0 PMD being used: compress_qat 00:30:48.267 b1992b04-e1ce-4fcf-b073-59bbea4b828d 00:30:48.268 10:38:24 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:48.268 8018a8f4-45db-4fad-aa8c-ec5b7d9edbc4 
00:30:48.268 10:38:25 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:48.268 10:38:25 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:48.268 10:38:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:48.268 10:38:25 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:48.268 10:38:25 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:48.268 10:38:25 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:48.268 10:38:25 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:48.525 10:38:25 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:48.525 [ 00:30:48.525 { 00:30:48.525 "name": "8018a8f4-45db-4fad-aa8c-ec5b7d9edbc4", 00:30:48.525 "aliases": [ 00:30:48.525 "lvs0/lv0" 00:30:48.525 ], 00:30:48.525 "product_name": "Logical Volume", 00:30:48.525 "block_size": 512, 00:30:48.525 "num_blocks": 204800, 00:30:48.525 "uuid": "8018a8f4-45db-4fad-aa8c-ec5b7d9edbc4", 00:30:48.525 "assigned_rate_limits": { 00:30:48.525 "rw_ios_per_sec": 0, 00:30:48.525 "rw_mbytes_per_sec": 0, 00:30:48.525 "r_mbytes_per_sec": 0, 00:30:48.525 "w_mbytes_per_sec": 0 00:30:48.525 }, 00:30:48.525 "claimed": false, 00:30:48.525 "zoned": false, 00:30:48.525 "supported_io_types": { 00:30:48.525 "read": true, 00:30:48.525 "write": true, 00:30:48.525 "unmap": true, 00:30:48.525 "flush": false, 00:30:48.525 "reset": true, 00:30:48.525 "nvme_admin": false, 00:30:48.525 "nvme_io": false, 00:30:48.525 "nvme_io_md": false, 00:30:48.525 "write_zeroes": true, 00:30:48.525 "zcopy": false, 00:30:48.525 "get_zone_info": false, 00:30:48.525 "zone_management": false, 00:30:48.525 "zone_append": false, 00:30:48.525 "compare": false, 00:30:48.525 
"compare_and_write": false, 00:30:48.525 "abort": false, 00:30:48.525 "seek_hole": true, 00:30:48.526 "seek_data": true, 00:30:48.526 "copy": false, 00:30:48.526 "nvme_iov_md": false 00:30:48.526 }, 00:30:48.526 "driver_specific": { 00:30:48.526 "lvol": { 00:30:48.526 "lvol_store_uuid": "b1992b04-e1ce-4fcf-b073-59bbea4b828d", 00:30:48.526 "base_bdev": "Nvme0n1", 00:30:48.526 "thin_provision": true, 00:30:48.526 "num_allocated_clusters": 0, 00:30:48.526 "snapshot": false, 00:30:48.526 "clone": false, 00:30:48.526 "esnap_clone": false 00:30:48.526 } 00:30:48.526 } 00:30:48.526 } 00:30:48.526 ] 00:30:48.526 10:38:25 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:48.526 10:38:25 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:48.526 10:38:25 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:48.783 [2024-07-15 10:38:25.951126] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:48.783 COMP_lvs0/lv0 00:30:49.041 10:38:25 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:49.041 10:38:25 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:49.041 10:38:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:49.041 10:38:25 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:49.041 10:38:25 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:49.041 10:38:25 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:49.041 10:38:25 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:49.041 10:38:26 compress_compdev -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:49.299 [ 00:30:49.299 { 00:30:49.299 "name": "COMP_lvs0/lv0", 00:30:49.299 "aliases": [ 00:30:49.299 "5fa46855-c44e-5d74-8f90-f5fce66d079a" 00:30:49.299 ], 00:30:49.299 "product_name": "compress", 00:30:49.299 "block_size": 4096, 00:30:49.299 "num_blocks": 25088, 00:30:49.299 "uuid": "5fa46855-c44e-5d74-8f90-f5fce66d079a", 00:30:49.299 "assigned_rate_limits": { 00:30:49.299 "rw_ios_per_sec": 0, 00:30:49.299 "rw_mbytes_per_sec": 0, 00:30:49.299 "r_mbytes_per_sec": 0, 00:30:49.299 "w_mbytes_per_sec": 0 00:30:49.299 }, 00:30:49.299 "claimed": false, 00:30:49.299 "zoned": false, 00:30:49.299 "supported_io_types": { 00:30:49.299 "read": true, 00:30:49.299 "write": true, 00:30:49.299 "unmap": false, 00:30:49.299 "flush": false, 00:30:49.299 "reset": false, 00:30:49.299 "nvme_admin": false, 00:30:49.299 "nvme_io": false, 00:30:49.299 "nvme_io_md": false, 00:30:49.299 "write_zeroes": true, 00:30:49.299 "zcopy": false, 00:30:49.299 "get_zone_info": false, 00:30:49.299 "zone_management": false, 00:30:49.299 "zone_append": false, 00:30:49.299 "compare": false, 00:30:49.299 "compare_and_write": false, 00:30:49.299 "abort": false, 00:30:49.299 "seek_hole": false, 00:30:49.299 "seek_data": false, 00:30:49.299 "copy": false, 00:30:49.299 "nvme_iov_md": false 00:30:49.299 }, 00:30:49.299 "driver_specific": { 00:30:49.299 "compress": { 00:30:49.299 "name": "COMP_lvs0/lv0", 00:30:49.299 "base_bdev_name": "8018a8f4-45db-4fad-aa8c-ec5b7d9edbc4" 00:30:49.299 } 00:30:49.299 } 00:30:49.299 } 00:30:49.299 ] 00:30:49.299 10:38:26 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:49.299 10:38:26 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:49.557 [2024-07-15 10:38:26.593573] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fde801b15c0 PMD being 
used: compress_qat 00:30:49.557 [2024-07-15 10:38:26.595755] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2979700 PMD being used: compress_qat 00:30:49.557 Running I/O for 3 seconds... 00:30:52.834 00:30:52.834 Latency(us) 00:30:52.834 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:52.834 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:52.834 Verification LBA range: start 0x0 length 0x3100 00:30:52.834 COMP_lvs0/lv0 : 3.00 5134.20 20.06 0.00 0.00 6181.10 432.75 5670.29 00:30:52.834 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:52.834 Verification LBA range: start 0x3100 length 0x3100 00:30:52.834 COMP_lvs0/lv0 : 3.00 5377.04 21.00 0.00 0.00 5914.35 306.31 5727.28 00:30:52.834 =================================================================================================================== 00:30:52.834 Total : 10511.24 41.06 0.00 0.00 6044.64 306.31 5727.28 00:30:52.834 0 00:30:52.834 10:38:29 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:30:52.834 10:38:29 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:52.834 10:38:29 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:53.092 10:38:30 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:53.092 10:38:30 compress_compdev -- compress/compress.sh@78 -- # killprocess 644135 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 644135 ']' 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 644135 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:53.092 
10:38:30 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 644135 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 644135' 00:30:53.092 killing process with pid 644135 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@967 -- # kill 644135 00:30:53.092 Received shutdown signal, test time was about 3.000000 seconds 00:30:53.092 00:30:53.092 Latency(us) 00:30:53.092 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:53.092 =================================================================================================================== 00:30:53.092 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:53.092 10:38:30 compress_compdev -- common/autotest_common.sh@972 -- # wait 644135 00:30:56.373 10:38:33 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:30:56.373 10:38:33 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:30:56.373 10:38:33 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=645790 00:30:56.373 10:38:33 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:56.373 10:38:33 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:30:56.373 10:38:33 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 645790 00:30:56.373 10:38:33 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 645790 ']' 00:30:56.373 10:38:33 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:56.373 10:38:33 compress_compdev -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:30:56.373 10:38:33 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:56.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:56.373 10:38:33 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:56.373 10:38:33 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:30:56.373 [2024-07-15 10:38:33.259045] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:30:56.373 [2024-07-15 10:38:33.259119] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid645790 ] 00:30:56.373 [2024-07-15 10:38:33.390776] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:56.373 [2024-07-15 10:38:33.493178] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:56.373 [2024-07-15 10:38:33.493261] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:56.373 [2024-07-15 10:38:33.493267] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:57.309 [2024-07-15 10:38:34.235063] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:30:57.309 10:38:34 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:57.309 10:38:34 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:30:57.309 10:38:34 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:30:57.309 10:38:34 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:57.309 10:38:34 compress_compdev -- compress/compress.sh@34 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:57.876 [2024-07-15 10:38:34.823545] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c52f20 PMD being used: compress_qat 00:30:57.876 10:38:34 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:57.876 10:38:34 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:57.876 10:38:34 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:57.876 10:38:34 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:30:57.876 10:38:34 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:57.876 10:38:34 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:57.876 10:38:34 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:58.133 10:38:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:58.133 [ 00:30:58.133 { 00:30:58.133 "name": "Nvme0n1", 00:30:58.133 "aliases": [ 00:30:58.133 "01000000-0000-0000-5cd2-e43197705251" 00:30:58.133 ], 00:30:58.133 "product_name": "NVMe disk", 00:30:58.133 "block_size": 512, 00:30:58.133 "num_blocks": 15002931888, 00:30:58.133 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:30:58.133 "assigned_rate_limits": { 00:30:58.133 "rw_ios_per_sec": 0, 00:30:58.133 "rw_mbytes_per_sec": 0, 00:30:58.133 "r_mbytes_per_sec": 0, 00:30:58.133 "w_mbytes_per_sec": 0 00:30:58.133 }, 00:30:58.133 "claimed": false, 00:30:58.133 "zoned": false, 00:30:58.133 "supported_io_types": { 00:30:58.133 "read": true, 00:30:58.133 "write": true, 00:30:58.133 "unmap": true, 00:30:58.133 "flush": true, 00:30:58.133 "reset": true, 00:30:58.133 "nvme_admin": true, 00:30:58.133 "nvme_io": true, 00:30:58.133 "nvme_io_md": false, 00:30:58.133 "write_zeroes": true, 
00:30:58.133 "zcopy": false, 00:30:58.133 "get_zone_info": false, 00:30:58.133 "zone_management": false, 00:30:58.133 "zone_append": false, 00:30:58.133 "compare": false, 00:30:58.133 "compare_and_write": false, 00:30:58.133 "abort": true, 00:30:58.133 "seek_hole": false, 00:30:58.133 "seek_data": false, 00:30:58.133 "copy": false, 00:30:58.133 "nvme_iov_md": false 00:30:58.133 }, 00:30:58.133 "driver_specific": { 00:30:58.133 "nvme": [ 00:30:58.133 { 00:30:58.133 "pci_address": "0000:5e:00.0", 00:30:58.133 "trid": { 00:30:58.133 "trtype": "PCIe", 00:30:58.133 "traddr": "0000:5e:00.0" 00:30:58.133 }, 00:30:58.133 "ctrlr_data": { 00:30:58.133 "cntlid": 0, 00:30:58.133 "vendor_id": "0x8086", 00:30:58.133 "model_number": "INTEL SSDPF2KX076TZO", 00:30:58.133 "serial_number": "PHAC0301002G7P6CGN", 00:30:58.133 "firmware_revision": "JCV10200", 00:30:58.133 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:30:58.133 "oacs": { 00:30:58.133 "security": 1, 00:30:58.133 "format": 1, 00:30:58.133 "firmware": 1, 00:30:58.133 "ns_manage": 1 00:30:58.133 }, 00:30:58.133 "multi_ctrlr": false, 00:30:58.133 "ana_reporting": false 00:30:58.133 }, 00:30:58.133 "vs": { 00:30:58.133 "nvme_version": "1.3" 00:30:58.133 }, 00:30:58.133 "ns_data": { 00:30:58.133 "id": 1, 00:30:58.133 "can_share": false 00:30:58.133 }, 00:30:58.133 "security": { 00:30:58.133 "opal": true 00:30:58.133 } 00:30:58.133 } 00:30:58.133 ], 00:30:58.133 "mp_policy": "active_passive" 00:30:58.133 } 00:30:58.133 } 00:30:58.133 ] 00:30:58.133 10:38:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:30:58.133 10:38:35 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:58.390 [2024-07-15 10:38:35.432503] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1aa1440 PMD being used: compress_qat 00:31:00.969 dadfa617-a9be-4b5d-a788-46bf5e06f202 00:31:00.969 10:38:37 
compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:00.969 bf83e11f-4715-440d-8d40-6bf49dc3ad5c 00:31:00.969 10:38:37 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:00.969 10:38:37 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:00.969 10:38:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:00.969 10:38:37 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:00.969 10:38:37 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:00.969 10:38:37 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:00.969 10:38:37 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:00.969 10:38:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:01.228 [ 00:31:01.228 { 00:31:01.228 "name": "bf83e11f-4715-440d-8d40-6bf49dc3ad5c", 00:31:01.228 "aliases": [ 00:31:01.228 "lvs0/lv0" 00:31:01.228 ], 00:31:01.228 "product_name": "Logical Volume", 00:31:01.228 "block_size": 512, 00:31:01.228 "num_blocks": 204800, 00:31:01.228 "uuid": "bf83e11f-4715-440d-8d40-6bf49dc3ad5c", 00:31:01.228 "assigned_rate_limits": { 00:31:01.228 "rw_ios_per_sec": 0, 00:31:01.228 "rw_mbytes_per_sec": 0, 00:31:01.228 "r_mbytes_per_sec": 0, 00:31:01.228 "w_mbytes_per_sec": 0 00:31:01.228 }, 00:31:01.228 "claimed": false, 00:31:01.228 "zoned": false, 00:31:01.228 "supported_io_types": { 00:31:01.228 "read": true, 00:31:01.228 "write": true, 00:31:01.228 "unmap": true, 00:31:01.228 "flush": false, 00:31:01.228 "reset": true, 00:31:01.228 "nvme_admin": false, 00:31:01.228 "nvme_io": false, 00:31:01.228 "nvme_io_md": false, 00:31:01.228 "write_zeroes": true, 
00:31:01.228 "zcopy": false, 00:31:01.228 "get_zone_info": false, 00:31:01.228 "zone_management": false, 00:31:01.228 "zone_append": false, 00:31:01.228 "compare": false, 00:31:01.228 "compare_and_write": false, 00:31:01.228 "abort": false, 00:31:01.228 "seek_hole": true, 00:31:01.228 "seek_data": true, 00:31:01.228 "copy": false, 00:31:01.228 "nvme_iov_md": false 00:31:01.228 }, 00:31:01.228 "driver_specific": { 00:31:01.228 "lvol": { 00:31:01.228 "lvol_store_uuid": "dadfa617-a9be-4b5d-a788-46bf5e06f202", 00:31:01.228 "base_bdev": "Nvme0n1", 00:31:01.228 "thin_provision": true, 00:31:01.228 "num_allocated_clusters": 0, 00:31:01.228 "snapshot": false, 00:31:01.228 "clone": false, 00:31:01.228 "esnap_clone": false 00:31:01.228 } 00:31:01.228 } 00:31:01.228 } 00:31:01.228 ] 00:31:01.228 10:38:38 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:01.228 10:38:38 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:01.228 10:38:38 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:01.228 [2024-07-15 10:38:38.399442] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:01.228 COMP_lvs0/lv0 00:31:01.486 10:38:38 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:01.486 10:38:38 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:01.486 10:38:38 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:01.486 10:38:38 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:01.486 10:38:38 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:01.486 10:38:38 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:01.486 10:38:38 compress_compdev -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:01.486 10:38:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:01.743 [ 00:31:01.743 { 00:31:01.743 "name": "COMP_lvs0/lv0", 00:31:01.743 "aliases": [ 00:31:01.743 "4af26033-d96f-5d7c-aae0-bd2a1dd3fe66" 00:31:01.743 ], 00:31:01.743 "product_name": "compress", 00:31:01.743 "block_size": 512, 00:31:01.743 "num_blocks": 200704, 00:31:01.743 "uuid": "4af26033-d96f-5d7c-aae0-bd2a1dd3fe66", 00:31:01.743 "assigned_rate_limits": { 00:31:01.743 "rw_ios_per_sec": 0, 00:31:01.743 "rw_mbytes_per_sec": 0, 00:31:01.743 "r_mbytes_per_sec": 0, 00:31:01.743 "w_mbytes_per_sec": 0 00:31:01.743 }, 00:31:01.743 "claimed": false, 00:31:01.743 "zoned": false, 00:31:01.743 "supported_io_types": { 00:31:01.743 "read": true, 00:31:01.743 "write": true, 00:31:01.743 "unmap": false, 00:31:01.743 "flush": false, 00:31:01.743 "reset": false, 00:31:01.743 "nvme_admin": false, 00:31:01.743 "nvme_io": false, 00:31:01.743 "nvme_io_md": false, 00:31:01.743 "write_zeroes": true, 00:31:01.743 "zcopy": false, 00:31:01.743 "get_zone_info": false, 00:31:01.743 "zone_management": false, 00:31:01.743 "zone_append": false, 00:31:01.743 "compare": false, 00:31:01.743 "compare_and_write": false, 00:31:01.743 "abort": false, 00:31:01.743 "seek_hole": false, 00:31:01.743 "seek_data": false, 00:31:01.743 "copy": false, 00:31:01.743 "nvme_iov_md": false 00:31:01.743 }, 00:31:01.743 "driver_specific": { 00:31:01.743 "compress": { 00:31:01.743 "name": "COMP_lvs0/lv0", 00:31:01.743 "base_bdev_name": "bf83e11f-4715-440d-8d40-6bf49dc3ad5c" 00:31:01.743 } 00:31:01.743 } 00:31:01.743 } 00:31:01.743 ] 00:31:01.743 10:38:38 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:01.743 10:38:38 compress_compdev -- compress/compress.sh@59 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:01.743 [2024-07-15 10:38:38.912236] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fdfc81b1350 PMD being used: compress_qat 00:31:01.743 I/O targets: 00:31:01.743 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:01.744 00:31:01.744 00:31:01.744 CUnit - A unit testing framework for C - Version 2.1-3 00:31:01.744 http://cunit.sourceforge.net/ 00:31:01.744 00:31:01.744 00:31:01.744 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:01.744 Test: blockdev write read block ...passed 00:31:01.744 Test: blockdev write zeroes read block ...passed 00:31:01.744 Test: blockdev write zeroes read no split ...passed 00:31:01.744 Test: blockdev write zeroes read split ...passed 00:31:02.001 Test: blockdev write zeroes read split partial ...passed 00:31:02.001 Test: blockdev reset ...[2024-07-15 10:38:38.950217] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:02.001 passed 00:31:02.001 Test: blockdev write read 8 blocks ...passed 00:31:02.001 Test: blockdev write read size > 128k ...passed 00:31:02.001 Test: blockdev write read invalid size ...passed 00:31:02.001 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:02.001 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:02.001 Test: blockdev write read max offset ...passed 00:31:02.001 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:02.001 Test: blockdev writev readv 8 blocks ...passed 00:31:02.001 Test: blockdev writev readv 30 x 1block ...passed 00:31:02.001 Test: blockdev writev readv block ...passed 00:31:02.001 Test: blockdev writev readv size > 128k ...passed 00:31:02.001 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:02.001 Test: blockdev comparev and writev ...passed 00:31:02.001 Test: blockdev nvme passthru rw ...passed 00:31:02.001 Test: blockdev nvme passthru vendor 
specific ...passed 00:31:02.001 Test: blockdev nvme admin passthru ...passed 00:31:02.001 Test: blockdev copy ...passed 00:31:02.001 00:31:02.001 Run Summary: Type Total Ran Passed Failed Inactive 00:31:02.001 suites 1 1 n/a 0 0 00:31:02.001 tests 23 23 23 0 0 00:31:02.001 asserts 130 130 130 0 n/a 00:31:02.001 00:31:02.001 Elapsed time = 0.092 seconds 00:31:02.001 0 00:31:02.001 10:38:38 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:31:02.001 10:38:38 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:02.275 10:38:39 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:02.275 10:38:39 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:02.275 10:38:39 compress_compdev -- compress/compress.sh@62 -- # killprocess 645790 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 645790 ']' 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 645790 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 645790 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 645790' 00:31:02.275 killing process with pid 645790 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@967 -- # kill 645790 00:31:02.275 10:38:39 compress_compdev -- common/autotest_common.sh@972 -- # wait 645790 
00:31:05.541 10:38:42 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:05.541 10:38:42 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:05.541 00:31:05.541 real 0m46.824s 00:31:05.541 user 1m46.513s 00:31:05.541 sys 0m5.634s 00:31:05.541 10:38:42 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:05.541 10:38:42 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:05.541 ************************************ 00:31:05.541 END TEST compress_compdev 00:31:05.541 ************************************ 00:31:05.541 10:38:42 -- common/autotest_common.sh@1142 -- # return 0 00:31:05.541 10:38:42 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:05.541 10:38:42 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:05.541 10:38:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:05.541 10:38:42 -- common/autotest_common.sh@10 -- # set +x 00:31:05.541 ************************************ 00:31:05.541 START TEST compress_isal 00:31:05.541 ************************************ 00:31:05.541 10:38:42 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:31:05.541 * Looking for test storage... 
00:31:05.541 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:05.541 10:38:42 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:05.541 10:38:42 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:05.541 10:38:42 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:05.541 10:38:42 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:05.541 10:38:42 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:05.541 10:38:42 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:05.541 10:38:42 compress_isal -- paths/export.sh@5 -- # export PATH 00:31:05.541 10:38:42 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@47 -- # : 0 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:05.541 10:38:42 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=647046 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:05.541 10:38:42 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 647046 00:31:05.541 10:38:42 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 647046 ']' 00:31:05.541 10:38:42 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:05.541 10:38:42 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:05.541 10:38:42 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:05.541 10:38:42 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:05.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:05.541 10:38:42 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:05.541 10:38:42 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:05.799 [2024-07-15 10:38:42.772258] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:31:05.799 [2024-07-15 10:38:42.772334] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid647046 ] 00:31:05.799 [2024-07-15 10:38:42.893243] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:05.799 [2024-07-15 10:38:42.996308] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:05.799 [2024-07-15 10:38:42.996314] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:06.729 10:38:43 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:06.729 10:38:43 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:06.729 10:38:43 compress_isal -- compress/compress.sh@74 -- # create_vols 00:31:06.729 10:38:43 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:06.729 10:38:43 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:07.295 10:38:44 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:07.295 10:38:44 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:07.295 10:38:44 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:07.295 10:38:44 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:07.295 10:38:44 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:07.295 10:38:44 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:07.295 10:38:44 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:07.295 10:38:44 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:07.553 [ 00:31:07.553 { 00:31:07.553 "name": "Nvme0n1", 00:31:07.553 "aliases": [ 00:31:07.553 "01000000-0000-0000-5cd2-e43197705251" 00:31:07.553 ], 00:31:07.553 "product_name": "NVMe disk", 00:31:07.553 "block_size": 512, 00:31:07.553 "num_blocks": 15002931888, 00:31:07.553 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:07.553 "assigned_rate_limits": { 00:31:07.553 "rw_ios_per_sec": 0, 00:31:07.553 "rw_mbytes_per_sec": 0, 00:31:07.553 "r_mbytes_per_sec": 0, 00:31:07.553 "w_mbytes_per_sec": 0 00:31:07.553 }, 00:31:07.553 "claimed": false, 00:31:07.553 "zoned": false, 00:31:07.553 "supported_io_types": { 00:31:07.553 "read": true, 00:31:07.553 "write": true, 00:31:07.553 "unmap": true, 00:31:07.553 "flush": true, 00:31:07.553 "reset": true, 00:31:07.553 "nvme_admin": true, 00:31:07.553 "nvme_io": true, 00:31:07.553 "nvme_io_md": false, 00:31:07.553 "write_zeroes": true, 00:31:07.553 "zcopy": false, 00:31:07.553 "get_zone_info": false, 00:31:07.553 "zone_management": false, 00:31:07.553 "zone_append": false, 00:31:07.553 "compare": false, 00:31:07.553 "compare_and_write": false, 00:31:07.553 "abort": true, 00:31:07.553 "seek_hole": false, 00:31:07.553 "seek_data": false, 00:31:07.553 "copy": false, 00:31:07.553 "nvme_iov_md": false 00:31:07.553 }, 00:31:07.553 "driver_specific": { 00:31:07.553 "nvme": [ 00:31:07.553 { 00:31:07.553 "pci_address": "0000:5e:00.0", 00:31:07.553 "trid": { 00:31:07.553 "trtype": "PCIe", 00:31:07.553 "traddr": "0000:5e:00.0" 00:31:07.553 }, 00:31:07.553 "ctrlr_data": { 00:31:07.553 "cntlid": 0, 00:31:07.553 "vendor_id": "0x8086", 00:31:07.553 "model_number": "INTEL SSDPF2KX076TZO", 00:31:07.553 "serial_number": "PHAC0301002G7P6CGN", 00:31:07.553 "firmware_revision": "JCV10200", 00:31:07.553 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:07.553 "oacs": { 00:31:07.553 "security": 1, 00:31:07.553 "format": 1, 00:31:07.553 "firmware": 1, 00:31:07.553 "ns_manage": 1 00:31:07.553 }, 
00:31:07.553 "multi_ctrlr": false, 00:31:07.553 "ana_reporting": false 00:31:07.553 }, 00:31:07.553 "vs": { 00:31:07.553 "nvme_version": "1.3" 00:31:07.553 }, 00:31:07.553 "ns_data": { 00:31:07.553 "id": 1, 00:31:07.553 "can_share": false 00:31:07.553 }, 00:31:07.553 "security": { 00:31:07.553 "opal": true 00:31:07.553 } 00:31:07.553 } 00:31:07.553 ], 00:31:07.553 "mp_policy": "active_passive" 00:31:07.553 } 00:31:07.553 } 00:31:07.553 ] 00:31:07.553 10:38:44 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:07.553 10:38:44 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:10.078 b12fb779-31bb-4ac7-88ab-ab807cd4df11 00:31:10.078 10:38:47 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:10.334 93d92d47-bfe7-4aab-a33d-0e9918399085 00:31:10.334 10:38:47 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:10.334 10:38:47 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:10.334 10:38:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:10.334 10:38:47 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:10.334 10:38:47 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:10.334 10:38:47 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:10.334 10:38:47 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:10.590 10:38:47 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:10.847 [ 00:31:10.847 { 00:31:10.847 "name": "93d92d47-bfe7-4aab-a33d-0e9918399085", 00:31:10.847 "aliases": [ 00:31:10.847 "lvs0/lv0" 
00:31:10.847 ], 00:31:10.847 "product_name": "Logical Volume", 00:31:10.847 "block_size": 512, 00:31:10.847 "num_blocks": 204800, 00:31:10.847 "uuid": "93d92d47-bfe7-4aab-a33d-0e9918399085", 00:31:10.847 "assigned_rate_limits": { 00:31:10.847 "rw_ios_per_sec": 0, 00:31:10.847 "rw_mbytes_per_sec": 0, 00:31:10.847 "r_mbytes_per_sec": 0, 00:31:10.847 "w_mbytes_per_sec": 0 00:31:10.847 }, 00:31:10.847 "claimed": false, 00:31:10.847 "zoned": false, 00:31:10.847 "supported_io_types": { 00:31:10.847 "read": true, 00:31:10.847 "write": true, 00:31:10.847 "unmap": true, 00:31:10.847 "flush": false, 00:31:10.847 "reset": true, 00:31:10.847 "nvme_admin": false, 00:31:10.847 "nvme_io": false, 00:31:10.847 "nvme_io_md": false, 00:31:10.847 "write_zeroes": true, 00:31:10.847 "zcopy": false, 00:31:10.847 "get_zone_info": false, 00:31:10.847 "zone_management": false, 00:31:10.847 "zone_append": false, 00:31:10.847 "compare": false, 00:31:10.847 "compare_and_write": false, 00:31:10.847 "abort": false, 00:31:10.847 "seek_hole": true, 00:31:10.847 "seek_data": true, 00:31:10.847 "copy": false, 00:31:10.847 "nvme_iov_md": false 00:31:10.847 }, 00:31:10.847 "driver_specific": { 00:31:10.847 "lvol": { 00:31:10.847 "lvol_store_uuid": "b12fb779-31bb-4ac7-88ab-ab807cd4df11", 00:31:10.847 "base_bdev": "Nvme0n1", 00:31:10.847 "thin_provision": true, 00:31:10.847 "num_allocated_clusters": 0, 00:31:10.847 "snapshot": false, 00:31:10.847 "clone": false, 00:31:10.847 "esnap_clone": false 00:31:10.847 } 00:31:10.847 } 00:31:10.847 } 00:31:10.847 ] 00:31:10.847 10:38:47 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:10.847 10:38:47 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:10.847 10:38:47 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:11.104 [2024-07-15 10:38:48.078822] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:31:11.104 COMP_lvs0/lv0 00:31:11.104 10:38:48 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:11.104 10:38:48 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:11.104 10:38:48 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:11.104 10:38:48 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:11.104 10:38:48 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:11.104 10:38:48 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:11.104 10:38:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:11.361 10:38:48 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:11.618 [ 00:31:11.618 { 00:31:11.618 "name": "COMP_lvs0/lv0", 00:31:11.618 "aliases": [ 00:31:11.618 "bab68eae-58a4-56b6-b30f-dd79b30d7613" 00:31:11.618 ], 00:31:11.618 "product_name": "compress", 00:31:11.618 "block_size": 512, 00:31:11.618 "num_blocks": 200704, 00:31:11.618 "uuid": "bab68eae-58a4-56b6-b30f-dd79b30d7613", 00:31:11.618 "assigned_rate_limits": { 00:31:11.618 "rw_ios_per_sec": 0, 00:31:11.618 "rw_mbytes_per_sec": 0, 00:31:11.618 "r_mbytes_per_sec": 0, 00:31:11.618 "w_mbytes_per_sec": 0 00:31:11.618 }, 00:31:11.619 "claimed": false, 00:31:11.619 "zoned": false, 00:31:11.619 "supported_io_types": { 00:31:11.619 "read": true, 00:31:11.619 "write": true, 00:31:11.619 "unmap": false, 00:31:11.619 "flush": false, 00:31:11.619 "reset": false, 00:31:11.619 "nvme_admin": false, 00:31:11.619 "nvme_io": false, 00:31:11.619 "nvme_io_md": false, 00:31:11.619 "write_zeroes": true, 00:31:11.619 "zcopy": false, 00:31:11.619 "get_zone_info": false, 00:31:11.619 "zone_management": false, 00:31:11.619 "zone_append": 
false, 00:31:11.619 "compare": false, 00:31:11.619 "compare_and_write": false, 00:31:11.619 "abort": false, 00:31:11.619 "seek_hole": false, 00:31:11.619 "seek_data": false, 00:31:11.619 "copy": false, 00:31:11.619 "nvme_iov_md": false 00:31:11.619 }, 00:31:11.619 "driver_specific": { 00:31:11.619 "compress": { 00:31:11.619 "name": "COMP_lvs0/lv0", 00:31:11.619 "base_bdev_name": "93d92d47-bfe7-4aab-a33d-0e9918399085" 00:31:11.619 } 00:31:11.619 } 00:31:11.619 } 00:31:11.619 ] 00:31:11.619 10:38:48 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:11.619 10:38:48 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:11.619 Running I/O for 3 seconds... 00:31:14.895 00:31:14.895 Latency(us) 00:31:14.895 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:14.895 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:14.895 Verification LBA range: start 0x0 length 0x3100 00:31:14.895 COMP_lvs0/lv0 : 3.01 2909.82 11.37 0.00 0.00 10951.12 669.61 9687.93 00:31:14.895 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:14.895 Verification LBA range: start 0x3100 length 0x3100 00:31:14.895 COMP_lvs0/lv0 : 3.01 2908.16 11.36 0.00 0.00 10965.40 1047.15 9402.99 00:31:14.895 =================================================================================================================== 00:31:14.895 Total : 5817.98 22.73 0.00 0.00 10958.26 669.61 9687.93 00:31:14.895 0 00:31:14.895 10:38:51 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:14.896 10:38:51 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:14.896 10:38:52 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 
00:31:15.153 10:38:52 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:15.153 10:38:52 compress_isal -- compress/compress.sh@78 -- # killprocess 647046 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 647046 ']' 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@952 -- # kill -0 647046 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 647046 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 647046' 00:31:15.153 killing process with pid 647046 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@967 -- # kill 647046 00:31:15.153 Received shutdown signal, test time was about 3.000000 seconds 00:31:15.153 00:31:15.153 Latency(us) 00:31:15.153 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:15.153 =================================================================================================================== 00:31:15.153 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:15.153 10:38:52 compress_isal -- common/autotest_common.sh@972 -- # wait 647046 00:31:18.429 10:38:55 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:31:18.429 10:38:55 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:18.429 10:38:55 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=648651 00:31:18.429 10:38:55 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:18.429 10:38:55 
compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:18.429 10:38:55 compress_isal -- compress/compress.sh@73 -- # waitforlisten 648651 00:31:18.429 10:38:55 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 648651 ']' 00:31:18.429 10:38:55 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:18.429 10:38:55 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:18.429 10:38:55 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:18.429 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:18.429 10:38:55 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:18.429 10:38:55 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:18.429 [2024-07-15 10:38:55.357946] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:31:18.429 [2024-07-15 10:38:55.358019] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid648651 ] 00:31:18.429 [2024-07-15 10:38:55.479534] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:18.429 [2024-07-15 10:38:55.578430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:18.429 [2024-07-15 10:38:55.578436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:19.361 10:38:56 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:19.361 10:38:56 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:19.361 10:38:56 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:31:19.361 10:38:56 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:19.361 10:38:56 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:19.927 10:38:56 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:19.927 10:38:56 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:19.927 10:38:56 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:19.927 10:38:56 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:19.927 10:38:56 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:19.927 10:38:56 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:19.927 10:38:56 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:20.184 10:38:57 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:20.184 [ 00:31:20.184 { 00:31:20.184 "name": "Nvme0n1", 00:31:20.184 "aliases": [ 00:31:20.184 "01000000-0000-0000-5cd2-e43197705251" 00:31:20.184 ], 00:31:20.184 "product_name": "NVMe disk", 00:31:20.184 "block_size": 512, 00:31:20.184 "num_blocks": 15002931888, 00:31:20.184 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:20.184 "assigned_rate_limits": { 00:31:20.184 "rw_ios_per_sec": 0, 00:31:20.184 "rw_mbytes_per_sec": 0, 00:31:20.184 "r_mbytes_per_sec": 0, 00:31:20.184 "w_mbytes_per_sec": 0 00:31:20.184 }, 00:31:20.184 "claimed": false, 00:31:20.184 "zoned": false, 00:31:20.184 "supported_io_types": { 00:31:20.184 "read": true, 00:31:20.184 "write": true, 00:31:20.184 "unmap": true, 00:31:20.184 "flush": true, 00:31:20.184 "reset": true, 00:31:20.184 "nvme_admin": true, 00:31:20.184 "nvme_io": true, 00:31:20.184 "nvme_io_md": false, 00:31:20.184 "write_zeroes": true, 00:31:20.184 "zcopy": false, 00:31:20.184 "get_zone_info": false, 00:31:20.184 "zone_management": false, 00:31:20.184 "zone_append": false, 00:31:20.184 "compare": false, 00:31:20.184 "compare_and_write": false, 00:31:20.184 "abort": true, 00:31:20.184 "seek_hole": false, 00:31:20.184 "seek_data": false, 00:31:20.184 "copy": false, 00:31:20.184 "nvme_iov_md": false 00:31:20.184 }, 00:31:20.184 "driver_specific": { 00:31:20.184 "nvme": [ 00:31:20.184 { 00:31:20.184 "pci_address": "0000:5e:00.0", 00:31:20.184 "trid": { 00:31:20.184 "trtype": "PCIe", 00:31:20.184 "traddr": "0000:5e:00.0" 00:31:20.184 }, 00:31:20.184 "ctrlr_data": { 00:31:20.185 "cntlid": 0, 00:31:20.185 "vendor_id": "0x8086", 00:31:20.185 "model_number": "INTEL SSDPF2KX076TZO", 00:31:20.185 "serial_number": "PHAC0301002G7P6CGN", 00:31:20.185 "firmware_revision": "JCV10200", 00:31:20.185 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:20.185 "oacs": { 00:31:20.185 "security": 1, 00:31:20.185 "format": 1, 00:31:20.185 "firmware": 1, 00:31:20.185 "ns_manage": 1 00:31:20.185 }, 
00:31:20.185 "multi_ctrlr": false, 00:31:20.185 "ana_reporting": false 00:31:20.185 }, 00:31:20.185 "vs": { 00:31:20.185 "nvme_version": "1.3" 00:31:20.185 }, 00:31:20.185 "ns_data": { 00:31:20.185 "id": 1, 00:31:20.185 "can_share": false 00:31:20.185 }, 00:31:20.185 "security": { 00:31:20.185 "opal": true 00:31:20.185 } 00:31:20.185 } 00:31:20.185 ], 00:31:20.185 "mp_policy": "active_passive" 00:31:20.185 } 00:31:20.185 } 00:31:20.185 ] 00:31:20.185 10:38:57 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:20.185 10:38:57 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:22.713 bc90d250-8514-4311-bb63-4d104c6902f4 00:31:22.713 10:38:59 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:23.050 8c6ae750-077a-4786-9d7e-7ff4bcd5c484 00:31:23.050 10:39:00 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:23.050 10:39:00 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:23.050 10:39:00 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:23.050 10:39:00 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:23.050 10:39:00 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:23.050 10:39:00 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:23.050 10:39:00 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:23.308 10:39:00 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:23.566 [ 00:31:23.566 { 00:31:23.566 "name": "8c6ae750-077a-4786-9d7e-7ff4bcd5c484", 00:31:23.566 "aliases": [ 00:31:23.566 "lvs0/lv0" 
00:31:23.566 ], 00:31:23.566 "product_name": "Logical Volume", 00:31:23.566 "block_size": 512, 00:31:23.566 "num_blocks": 204800, 00:31:23.566 "uuid": "8c6ae750-077a-4786-9d7e-7ff4bcd5c484", 00:31:23.566 "assigned_rate_limits": { 00:31:23.566 "rw_ios_per_sec": 0, 00:31:23.566 "rw_mbytes_per_sec": 0, 00:31:23.566 "r_mbytes_per_sec": 0, 00:31:23.566 "w_mbytes_per_sec": 0 00:31:23.566 }, 00:31:23.566 "claimed": false, 00:31:23.566 "zoned": false, 00:31:23.566 "supported_io_types": { 00:31:23.566 "read": true, 00:31:23.566 "write": true, 00:31:23.566 "unmap": true, 00:31:23.566 "flush": false, 00:31:23.566 "reset": true, 00:31:23.566 "nvme_admin": false, 00:31:23.566 "nvme_io": false, 00:31:23.566 "nvme_io_md": false, 00:31:23.566 "write_zeroes": true, 00:31:23.566 "zcopy": false, 00:31:23.566 "get_zone_info": false, 00:31:23.566 "zone_management": false, 00:31:23.566 "zone_append": false, 00:31:23.566 "compare": false, 00:31:23.566 "compare_and_write": false, 00:31:23.566 "abort": false, 00:31:23.566 "seek_hole": true, 00:31:23.566 "seek_data": true, 00:31:23.566 "copy": false, 00:31:23.566 "nvme_iov_md": false 00:31:23.566 }, 00:31:23.566 "driver_specific": { 00:31:23.566 "lvol": { 00:31:23.566 "lvol_store_uuid": "bc90d250-8514-4311-bb63-4d104c6902f4", 00:31:23.566 "base_bdev": "Nvme0n1", 00:31:23.566 "thin_provision": true, 00:31:23.566 "num_allocated_clusters": 0, 00:31:23.566 "snapshot": false, 00:31:23.566 "clone": false, 00:31:23.566 "esnap_clone": false 00:31:23.566 } 00:31:23.566 } 00:31:23.566 } 00:31:23.566 ] 00:31:23.566 10:39:00 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:23.566 10:39:00 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:31:23.566 10:39:00 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:31:23.824 [2024-07-15 10:39:00.803119] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered 
io_device and virtual bdev for: COMP_lvs0/lv0 00:31:23.824 COMP_lvs0/lv0 00:31:23.824 10:39:00 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:23.824 10:39:00 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:23.824 10:39:00 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:23.824 10:39:00 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:23.824 10:39:00 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:23.824 10:39:00 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:23.824 10:39:00 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:24.082 10:39:01 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:24.340 [ 00:31:24.340 { 00:31:24.340 "name": "COMP_lvs0/lv0", 00:31:24.340 "aliases": [ 00:31:24.340 "f92acaf3-e7ce-5e2c-b38f-82ad3dc7ffe9" 00:31:24.340 ], 00:31:24.340 "product_name": "compress", 00:31:24.340 "block_size": 512, 00:31:24.340 "num_blocks": 200704, 00:31:24.340 "uuid": "f92acaf3-e7ce-5e2c-b38f-82ad3dc7ffe9", 00:31:24.340 "assigned_rate_limits": { 00:31:24.340 "rw_ios_per_sec": 0, 00:31:24.340 "rw_mbytes_per_sec": 0, 00:31:24.340 "r_mbytes_per_sec": 0, 00:31:24.340 "w_mbytes_per_sec": 0 00:31:24.340 }, 00:31:24.340 "claimed": false, 00:31:24.340 "zoned": false, 00:31:24.340 "supported_io_types": { 00:31:24.340 "read": true, 00:31:24.340 "write": true, 00:31:24.340 "unmap": false, 00:31:24.340 "flush": false, 00:31:24.340 "reset": false, 00:31:24.340 "nvme_admin": false, 00:31:24.340 "nvme_io": false, 00:31:24.340 "nvme_io_md": false, 00:31:24.340 "write_zeroes": true, 00:31:24.340 "zcopy": false, 00:31:24.340 "get_zone_info": false, 00:31:24.340 "zone_management": false, 00:31:24.340 "zone_append": 
false, 00:31:24.340 "compare": false, 00:31:24.340 "compare_and_write": false, 00:31:24.340 "abort": false, 00:31:24.340 "seek_hole": false, 00:31:24.340 "seek_data": false, 00:31:24.340 "copy": false, 00:31:24.340 "nvme_iov_md": false 00:31:24.340 }, 00:31:24.340 "driver_specific": { 00:31:24.340 "compress": { 00:31:24.340 "name": "COMP_lvs0/lv0", 00:31:24.340 "base_bdev_name": "8c6ae750-077a-4786-9d7e-7ff4bcd5c484" 00:31:24.340 } 00:31:24.340 } 00:31:24.340 } 00:31:24.340 ] 00:31:24.340 10:39:01 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:24.341 10:39:01 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:24.341 Running I/O for 3 seconds... 00:31:27.618 00:31:27.618 Latency(us) 00:31:27.618 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:27.618 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:27.618 Verification LBA range: start 0x0 length 0x3100 00:31:27.618 COMP_lvs0/lv0 : 3.00 3879.60 15.15 0.00 0.00 8193.03 772.90 7921.31 00:31:27.618 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:27.618 Verification LBA range: start 0x3100 length 0x3100 00:31:27.618 COMP_lvs0/lv0 : 3.00 3885.42 15.18 0.00 0.00 8193.54 516.45 8092.27 00:31:27.618 =================================================================================================================== 00:31:27.618 Total : 7765.01 30.33 0.00 0.00 8193.28 516.45 8092.27 00:31:27.618 0 00:31:27.618 10:39:04 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:27.618 10:39:04 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:27.618 10:39:04 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:27.876 
10:39:04 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:27.876 10:39:04 compress_isal -- compress/compress.sh@78 -- # killprocess 648651 00:31:27.876 10:39:04 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 648651 ']' 00:31:27.876 10:39:04 compress_isal -- common/autotest_common.sh@952 -- # kill -0 648651 00:31:27.876 10:39:04 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:27.876 10:39:04 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:27.876 10:39:04 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 648651 00:31:27.876 10:39:05 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:27.876 10:39:05 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:27.876 10:39:05 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 648651' 00:31:27.876 killing process with pid 648651 00:31:27.876 10:39:05 compress_isal -- common/autotest_common.sh@967 -- # kill 648651 00:31:27.876 Received shutdown signal, test time was about 3.000000 seconds 00:31:27.876 00:31:27.876 Latency(us) 00:31:27.876 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:27.876 =================================================================================================================== 00:31:27.876 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:27.876 10:39:05 compress_isal -- common/autotest_common.sh@972 -- # wait 648651 00:31:31.154 10:39:07 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:31:31.154 10:39:07 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:31.154 10:39:07 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=650845 00:31:31.154 10:39:07 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:31.154 10:39:07 
compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:31:31.154 10:39:07 compress_isal -- compress/compress.sh@73 -- # waitforlisten 650845 00:31:31.154 10:39:07 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 650845 ']' 00:31:31.154 10:39:07 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:31.154 10:39:07 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:31.154 10:39:07 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:31.154 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:31.154 10:39:07 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:31.154 10:39:07 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:31.154 [2024-07-15 10:39:08.024961] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:31:31.154 [2024-07-15 10:39:08.025022] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid650845 ] 00:31:31.154 [2024-07-15 10:39:08.129915] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:31.154 [2024-07-15 10:39:08.229591] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:31.154 [2024-07-15 10:39:08.229598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:32.084 10:39:08 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:32.084 10:39:08 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:32.084 10:39:08 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:31:32.084 10:39:08 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:32.084 10:39:08 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:32.647 10:39:09 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:32.647 10:39:09 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:32.647 10:39:09 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:32.647 10:39:09 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:32.647 10:39:09 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:32.647 10:39:09 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:32.647 10:39:09 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:32.904 10:39:09 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:32.904 [ 00:31:32.904 { 00:31:32.904 "name": "Nvme0n1", 00:31:32.904 "aliases": [ 00:31:32.904 "01000000-0000-0000-5cd2-e43197705251" 00:31:32.904 ], 00:31:32.904 "product_name": "NVMe disk", 00:31:32.904 "block_size": 512, 00:31:32.904 "num_blocks": 15002931888, 00:31:32.904 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:32.904 "assigned_rate_limits": { 00:31:32.904 "rw_ios_per_sec": 0, 00:31:32.904 "rw_mbytes_per_sec": 0, 00:31:32.904 "r_mbytes_per_sec": 0, 00:31:32.904 "w_mbytes_per_sec": 0 00:31:32.904 }, 00:31:32.904 "claimed": false, 00:31:32.904 "zoned": false, 00:31:32.904 "supported_io_types": { 00:31:32.904 "read": true, 00:31:32.904 "write": true, 00:31:32.904 "unmap": true, 00:31:32.904 "flush": true, 00:31:32.904 "reset": true, 00:31:32.904 "nvme_admin": true, 00:31:32.904 "nvme_io": true, 00:31:32.904 "nvme_io_md": false, 00:31:32.904 "write_zeroes": true, 00:31:32.904 "zcopy": false, 00:31:32.904 "get_zone_info": false, 00:31:32.904 "zone_management": false, 00:31:32.904 "zone_append": false, 00:31:32.904 "compare": false, 00:31:32.904 "compare_and_write": false, 00:31:32.904 "abort": true, 00:31:32.904 "seek_hole": false, 00:31:32.904 "seek_data": false, 00:31:32.904 "copy": false, 00:31:32.904 "nvme_iov_md": false 00:31:32.904 }, 00:31:32.904 "driver_specific": { 00:31:32.904 "nvme": [ 00:31:32.904 { 00:31:32.904 "pci_address": "0000:5e:00.0", 00:31:32.904 "trid": { 00:31:32.904 "trtype": "PCIe", 00:31:32.904 "traddr": "0000:5e:00.0" 00:31:32.904 }, 00:31:32.904 "ctrlr_data": { 00:31:32.904 "cntlid": 0, 00:31:32.904 "vendor_id": "0x8086", 00:31:32.904 "model_number": "INTEL SSDPF2KX076TZO", 00:31:32.904 "serial_number": "PHAC0301002G7P6CGN", 00:31:32.904 "firmware_revision": "JCV10200", 00:31:32.904 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:32.904 "oacs": { 00:31:32.904 "security": 1, 00:31:32.904 "format": 1, 00:31:32.904 "firmware": 1, 00:31:32.904 "ns_manage": 1 00:31:32.904 }, 
00:31:32.904 "multi_ctrlr": false, 00:31:32.904 "ana_reporting": false 00:31:32.904 }, 00:31:32.904 "vs": { 00:31:32.904 "nvme_version": "1.3" 00:31:32.904 }, 00:31:32.904 "ns_data": { 00:31:32.904 "id": 1, 00:31:32.904 "can_share": false 00:31:32.904 }, 00:31:32.904 "security": { 00:31:32.904 "opal": true 00:31:32.904 } 00:31:32.904 } 00:31:32.904 ], 00:31:32.904 "mp_policy": "active_passive" 00:31:32.904 } 00:31:32.904 } 00:31:32.904 ] 00:31:32.904 10:39:10 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:32.904 10:39:10 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:35.430 b8708106-765d-4a41-bdc9-49377f54bfe2 00:31:35.430 10:39:12 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:35.688 bf4e0073-7f95-4bb0-879c-64e821a8d1ec 00:31:35.688 10:39:12 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:35.688 10:39:12 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:35.688 10:39:12 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:35.688 10:39:12 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:35.688 10:39:12 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:35.688 10:39:12 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:35.688 10:39:12 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:35.945 10:39:12 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:36.202 [ 00:31:36.202 { 00:31:36.202 "name": "bf4e0073-7f95-4bb0-879c-64e821a8d1ec", 00:31:36.202 "aliases": [ 00:31:36.202 "lvs0/lv0" 
00:31:36.202 ], 00:31:36.202 "product_name": "Logical Volume", 00:31:36.202 "block_size": 512, 00:31:36.202 "num_blocks": 204800, 00:31:36.202 "uuid": "bf4e0073-7f95-4bb0-879c-64e821a8d1ec", 00:31:36.202 "assigned_rate_limits": { 00:31:36.202 "rw_ios_per_sec": 0, 00:31:36.202 "rw_mbytes_per_sec": 0, 00:31:36.202 "r_mbytes_per_sec": 0, 00:31:36.202 "w_mbytes_per_sec": 0 00:31:36.202 }, 00:31:36.202 "claimed": false, 00:31:36.202 "zoned": false, 00:31:36.202 "supported_io_types": { 00:31:36.202 "read": true, 00:31:36.202 "write": true, 00:31:36.202 "unmap": true, 00:31:36.202 "flush": false, 00:31:36.202 "reset": true, 00:31:36.202 "nvme_admin": false, 00:31:36.202 "nvme_io": false, 00:31:36.202 "nvme_io_md": false, 00:31:36.202 "write_zeroes": true, 00:31:36.202 "zcopy": false, 00:31:36.202 "get_zone_info": false, 00:31:36.202 "zone_management": false, 00:31:36.202 "zone_append": false, 00:31:36.202 "compare": false, 00:31:36.202 "compare_and_write": false, 00:31:36.202 "abort": false, 00:31:36.202 "seek_hole": true, 00:31:36.202 "seek_data": true, 00:31:36.202 "copy": false, 00:31:36.202 "nvme_iov_md": false 00:31:36.202 }, 00:31:36.202 "driver_specific": { 00:31:36.202 "lvol": { 00:31:36.202 "lvol_store_uuid": "b8708106-765d-4a41-bdc9-49377f54bfe2", 00:31:36.202 "base_bdev": "Nvme0n1", 00:31:36.202 "thin_provision": true, 00:31:36.202 "num_allocated_clusters": 0, 00:31:36.202 "snapshot": false, 00:31:36.202 "clone": false, 00:31:36.202 "esnap_clone": false 00:31:36.202 } 00:31:36.202 } 00:31:36.202 } 00:31:36.202 ] 00:31:36.202 10:39:13 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:36.202 10:39:13 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:31:36.202 10:39:13 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:31:36.459 [2024-07-15 10:39:13.454865] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: 
registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:36.459 COMP_lvs0/lv0 00:31:36.459 10:39:13 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:36.459 10:39:13 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:36.459 10:39:13 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:36.459 10:39:13 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:36.459 10:39:13 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:36.459 10:39:13 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:36.459 10:39:13 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:36.716 10:39:13 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:36.974 [ 00:31:36.974 { 00:31:36.974 "name": "COMP_lvs0/lv0", 00:31:36.974 "aliases": [ 00:31:36.974 "5470b9b2-7064-5805-aa76-79db6b6e20e4" 00:31:36.974 ], 00:31:36.974 "product_name": "compress", 00:31:36.974 "block_size": 4096, 00:31:36.974 "num_blocks": 25088, 00:31:36.974 "uuid": "5470b9b2-7064-5805-aa76-79db6b6e20e4", 00:31:36.974 "assigned_rate_limits": { 00:31:36.974 "rw_ios_per_sec": 0, 00:31:36.974 "rw_mbytes_per_sec": 0, 00:31:36.974 "r_mbytes_per_sec": 0, 00:31:36.974 "w_mbytes_per_sec": 0 00:31:36.974 }, 00:31:36.974 "claimed": false, 00:31:36.974 "zoned": false, 00:31:36.974 "supported_io_types": { 00:31:36.974 "read": true, 00:31:36.974 "write": true, 00:31:36.974 "unmap": false, 00:31:36.974 "flush": false, 00:31:36.974 "reset": false, 00:31:36.974 "nvme_admin": false, 00:31:36.974 "nvme_io": false, 00:31:36.974 "nvme_io_md": false, 00:31:36.974 "write_zeroes": true, 00:31:36.974 "zcopy": false, 00:31:36.974 "get_zone_info": false, 00:31:36.974 "zone_management": false, 00:31:36.974 
"zone_append": false, 00:31:36.974 "compare": false, 00:31:36.974 "compare_and_write": false, 00:31:36.974 "abort": false, 00:31:36.974 "seek_hole": false, 00:31:36.974 "seek_data": false, 00:31:36.974 "copy": false, 00:31:36.974 "nvme_iov_md": false 00:31:36.974 }, 00:31:36.974 "driver_specific": { 00:31:36.974 "compress": { 00:31:36.974 "name": "COMP_lvs0/lv0", 00:31:36.974 "base_bdev_name": "bf4e0073-7f95-4bb0-879c-64e821a8d1ec" 00:31:36.974 } 00:31:36.974 } 00:31:36.974 } 00:31:36.974 ] 00:31:36.974 10:39:13 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:36.974 10:39:13 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:36.974 Running I/O for 3 seconds... 00:31:40.257 00:31:40.257 Latency(us) 00:31:40.257 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:40.257 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:31:40.257 Verification LBA range: start 0x0 length 0x3100 00:31:40.257 COMP_lvs0/lv0 : 3.00 3946.44 15.42 0.00 0.00 8054.01 687.42 6867.03 00:31:40.257 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:31:40.257 Verification LBA range: start 0x3100 length 0x3100 00:31:40.257 COMP_lvs0/lv0 : 3.00 3948.71 15.42 0.00 0.00 8062.12 512.89 6924.02 00:31:40.257 =================================================================================================================== 00:31:40.257 Total : 7895.16 30.84 0.00 0.00 8058.06 512.89 6924.02 00:31:40.257 0 00:31:40.257 10:39:17 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:31:40.257 10:39:17 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:40.257 10:39:17 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l 
lvs0 00:31:40.514 10:39:17 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:31:40.514 10:39:17 compress_isal -- compress/compress.sh@78 -- # killprocess 650845 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 650845 ']' 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@952 -- # kill -0 650845 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 650845 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 650845' 00:31:40.515 killing process with pid 650845 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@967 -- # kill 650845 00:31:40.515 Received shutdown signal, test time was about 3.000000 seconds 00:31:40.515 00:31:40.515 Latency(us) 00:31:40.515 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:40.515 =================================================================================================================== 00:31:40.515 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:31:40.515 10:39:17 compress_isal -- common/autotest_common.sh@972 -- # wait 650845 00:31:43.791 10:39:20 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:31:43.791 10:39:20 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:31:43.791 10:39:20 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=652503 00:31:43.791 10:39:20 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:43.791 10:39:20 compress_isal 
-- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:31:43.791 10:39:20 compress_isal -- compress/compress.sh@57 -- # waitforlisten 652503 00:31:43.792 10:39:20 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 652503 ']' 00:31:43.792 10:39:20 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:43.792 10:39:20 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:43.792 10:39:20 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:43.792 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:43.792 10:39:20 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:43.792 10:39:20 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:43.792 [2024-07-15 10:39:20.714535] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:31:43.792 [2024-07-15 10:39:20.714600] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid652503 ] 00:31:43.792 [2024-07-15 10:39:20.829202] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:43.792 [2024-07-15 10:39:20.933285] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:43.792 [2024-07-15 10:39:20.933369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:43.792 [2024-07-15 10:39:20.933375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:44.723 10:39:21 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:44.724 10:39:21 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:31:44.724 10:39:21 compress_isal -- compress/compress.sh@58 -- # create_vols 00:31:44.724 10:39:21 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:44.724 10:39:21 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:45.289 10:39:22 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:45.289 10:39:22 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:45.289 10:39:22 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:45.289 10:39:22 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:45.289 10:39:22 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:45.289 10:39:22 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:45.289 10:39:22 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:45.289 10:39:22 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:45.546 [ 00:31:45.546 { 00:31:45.546 "name": "Nvme0n1", 00:31:45.546 "aliases": [ 00:31:45.546 "01000000-0000-0000-5cd2-e43197705251" 00:31:45.546 ], 00:31:45.546 "product_name": "NVMe disk", 00:31:45.546 "block_size": 512, 00:31:45.546 "num_blocks": 15002931888, 00:31:45.546 "uuid": "01000000-0000-0000-5cd2-e43197705251", 00:31:45.546 "assigned_rate_limits": { 00:31:45.546 "rw_ios_per_sec": 0, 00:31:45.546 "rw_mbytes_per_sec": 0, 00:31:45.546 "r_mbytes_per_sec": 0, 00:31:45.546 "w_mbytes_per_sec": 0 00:31:45.546 }, 00:31:45.546 "claimed": false, 00:31:45.546 "zoned": false, 00:31:45.546 "supported_io_types": { 00:31:45.546 "read": true, 00:31:45.546 "write": true, 00:31:45.546 "unmap": true, 00:31:45.546 "flush": true, 00:31:45.546 "reset": true, 00:31:45.546 "nvme_admin": true, 00:31:45.546 "nvme_io": true, 00:31:45.546 "nvme_io_md": false, 00:31:45.546 "write_zeroes": true, 00:31:45.546 "zcopy": false, 00:31:45.546 "get_zone_info": false, 00:31:45.546 "zone_management": false, 00:31:45.546 "zone_append": false, 00:31:45.546 "compare": false, 00:31:45.546 "compare_and_write": false, 00:31:45.546 "abort": true, 00:31:45.546 "seek_hole": false, 00:31:45.546 "seek_data": false, 00:31:45.546 "copy": false, 00:31:45.546 "nvme_iov_md": false 00:31:45.546 }, 00:31:45.546 "driver_specific": { 00:31:45.546 "nvme": [ 00:31:45.546 { 00:31:45.546 "pci_address": "0000:5e:00.0", 00:31:45.546 "trid": { 00:31:45.546 "trtype": "PCIe", 00:31:45.546 "traddr": "0000:5e:00.0" 00:31:45.546 }, 00:31:45.546 "ctrlr_data": { 00:31:45.546 "cntlid": 0, 00:31:45.546 "vendor_id": "0x8086", 00:31:45.546 "model_number": "INTEL SSDPF2KX076TZO", 00:31:45.546 "serial_number": "PHAC0301002G7P6CGN", 00:31:45.547 "firmware_revision": "JCV10200", 00:31:45.547 "subnqn": "nqn.2020-07.com.intel:PHAC0301002G7P6CGN ", 00:31:45.547 "oacs": { 00:31:45.547 "security": 1, 
00:31:45.547 "format": 1, 00:31:45.547 "firmware": 1, 00:31:45.547 "ns_manage": 1 00:31:45.547 }, 00:31:45.547 "multi_ctrlr": false, 00:31:45.547 "ana_reporting": false 00:31:45.547 }, 00:31:45.547 "vs": { 00:31:45.547 "nvme_version": "1.3" 00:31:45.547 }, 00:31:45.547 "ns_data": { 00:31:45.547 "id": 1, 00:31:45.547 "can_share": false 00:31:45.547 }, 00:31:45.547 "security": { 00:31:45.547 "opal": true 00:31:45.547 } 00:31:45.547 } 00:31:45.547 ], 00:31:45.547 "mp_policy": "active_passive" 00:31:45.547 } 00:31:45.547 } 00:31:45.547 ] 00:31:45.547 10:39:22 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:45.547 10:39:22 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:48.118 b968510b-eec9-4177-9b0d-861c2e15b534 00:31:48.118 10:39:25 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:48.375 876c5653-588c-4a60-bbf1-6fd37ec85b5a 00:31:48.375 10:39:25 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:48.375 10:39:25 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:48.375 10:39:25 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:48.375 10:39:25 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:48.375 10:39:25 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:48.375 10:39:25 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:48.375 10:39:25 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:48.632 10:39:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:48.890 [ 00:31:48.890 { 00:31:48.890 
"name": "876c5653-588c-4a60-bbf1-6fd37ec85b5a", 00:31:48.890 "aliases": [ 00:31:48.890 "lvs0/lv0" 00:31:48.890 ], 00:31:48.890 "product_name": "Logical Volume", 00:31:48.890 "block_size": 512, 00:31:48.890 "num_blocks": 204800, 00:31:48.890 "uuid": "876c5653-588c-4a60-bbf1-6fd37ec85b5a", 00:31:48.890 "assigned_rate_limits": { 00:31:48.890 "rw_ios_per_sec": 0, 00:31:48.890 "rw_mbytes_per_sec": 0, 00:31:48.890 "r_mbytes_per_sec": 0, 00:31:48.890 "w_mbytes_per_sec": 0 00:31:48.890 }, 00:31:48.890 "claimed": false, 00:31:48.890 "zoned": false, 00:31:48.891 "supported_io_types": { 00:31:48.891 "read": true, 00:31:48.891 "write": true, 00:31:48.891 "unmap": true, 00:31:48.891 "flush": false, 00:31:48.891 "reset": true, 00:31:48.891 "nvme_admin": false, 00:31:48.891 "nvme_io": false, 00:31:48.891 "nvme_io_md": false, 00:31:48.891 "write_zeroes": true, 00:31:48.891 "zcopy": false, 00:31:48.891 "get_zone_info": false, 00:31:48.891 "zone_management": false, 00:31:48.891 "zone_append": false, 00:31:48.891 "compare": false, 00:31:48.891 "compare_and_write": false, 00:31:48.891 "abort": false, 00:31:48.891 "seek_hole": true, 00:31:48.891 "seek_data": true, 00:31:48.891 "copy": false, 00:31:48.891 "nvme_iov_md": false 00:31:48.891 }, 00:31:48.891 "driver_specific": { 00:31:48.891 "lvol": { 00:31:48.891 "lvol_store_uuid": "b968510b-eec9-4177-9b0d-861c2e15b534", 00:31:48.891 "base_bdev": "Nvme0n1", 00:31:48.891 "thin_provision": true, 00:31:48.891 "num_allocated_clusters": 0, 00:31:48.891 "snapshot": false, 00:31:48.891 "clone": false, 00:31:48.891 "esnap_clone": false 00:31:48.891 } 00:31:48.891 } 00:31:48.891 } 00:31:48.891 ] 00:31:48.891 10:39:25 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:48.891 10:39:25 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:48.891 10:39:25 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:48.891 
[2024-07-15 10:39:26.080750] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:48.891 COMP_lvs0/lv0 00:31:49.148 10:39:26 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:49.148 10:39:26 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:49.148 10:39:26 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:49.148 10:39:26 compress_isal -- common/autotest_common.sh@899 -- # local i 00:31:49.148 10:39:26 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:49.148 10:39:26 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:49.148 10:39:26 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:49.148 10:39:26 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:49.405 [ 00:31:49.405 { 00:31:49.405 "name": "COMP_lvs0/lv0", 00:31:49.405 "aliases": [ 00:31:49.405 "60b1770f-a4b3-5c60-a448-c671e2acce2e" 00:31:49.405 ], 00:31:49.405 "product_name": "compress", 00:31:49.405 "block_size": 512, 00:31:49.405 "num_blocks": 200704, 00:31:49.405 "uuid": "60b1770f-a4b3-5c60-a448-c671e2acce2e", 00:31:49.405 "assigned_rate_limits": { 00:31:49.405 "rw_ios_per_sec": 0, 00:31:49.405 "rw_mbytes_per_sec": 0, 00:31:49.405 "r_mbytes_per_sec": 0, 00:31:49.405 "w_mbytes_per_sec": 0 00:31:49.405 }, 00:31:49.405 "claimed": false, 00:31:49.405 "zoned": false, 00:31:49.405 "supported_io_types": { 00:31:49.405 "read": true, 00:31:49.405 "write": true, 00:31:49.405 "unmap": false, 00:31:49.405 "flush": false, 00:31:49.405 "reset": false, 00:31:49.405 "nvme_admin": false, 00:31:49.405 "nvme_io": false, 00:31:49.405 "nvme_io_md": false, 00:31:49.405 "write_zeroes": true, 00:31:49.405 "zcopy": false, 00:31:49.405 
"get_zone_info": false, 00:31:49.405 "zone_management": false, 00:31:49.405 "zone_append": false, 00:31:49.405 "compare": false, 00:31:49.405 "compare_and_write": false, 00:31:49.405 "abort": false, 00:31:49.405 "seek_hole": false, 00:31:49.405 "seek_data": false, 00:31:49.405 "copy": false, 00:31:49.405 "nvme_iov_md": false 00:31:49.405 }, 00:31:49.405 "driver_specific": { 00:31:49.405 "compress": { 00:31:49.405 "name": "COMP_lvs0/lv0", 00:31:49.405 "base_bdev_name": "876c5653-588c-4a60-bbf1-6fd37ec85b5a" 00:31:49.405 } 00:31:49.405 } 00:31:49.405 } 00:31:49.405 ] 00:31:49.405 10:39:26 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:31:49.406 10:39:26 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:49.406 I/O targets: 00:31:49.406 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:31:49.406 00:31:49.406 00:31:49.406 CUnit - A unit testing framework for C - Version 2.1-3 00:31:49.406 http://cunit.sourceforge.net/ 00:31:49.406 00:31:49.406 00:31:49.406 Suite: bdevio tests on: COMP_lvs0/lv0 00:31:49.406 Test: blockdev write read block ...passed 00:31:49.406 Test: blockdev write zeroes read block ...passed 00:31:49.406 Test: blockdev write zeroes read no split ...passed 00:31:49.663 Test: blockdev write zeroes read split ...passed 00:31:49.663 Test: blockdev write zeroes read split partial ...passed 00:31:49.663 Test: blockdev reset ...[2024-07-15 10:39:26.635639] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:31:49.663 passed 00:31:49.663 Test: blockdev write read 8 blocks ...passed 00:31:49.663 Test: blockdev write read size > 128k ...passed 00:31:49.663 Test: blockdev write read invalid size ...passed 00:31:49.663 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:49.663 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:49.663 Test: blockdev write read max offset 
...passed 00:31:49.663 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:49.663 Test: blockdev writev readv 8 blocks ...passed 00:31:49.663 Test: blockdev writev readv 30 x 1block ...passed 00:31:49.663 Test: blockdev writev readv block ...passed 00:31:49.663 Test: blockdev writev readv size > 128k ...passed 00:31:49.663 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:49.663 Test: blockdev comparev and writev ...passed 00:31:49.663 Test: blockdev nvme passthru rw ...passed 00:31:49.663 Test: blockdev nvme passthru vendor specific ...passed 00:31:49.663 Test: blockdev nvme admin passthru ...passed 00:31:49.663 Test: blockdev copy ...passed 00:31:49.663 00:31:49.663 Run Summary: Type Total Ran Passed Failed Inactive 00:31:49.663 suites 1 1 n/a 0 0 00:31:49.663 tests 23 23 23 0 0 00:31:49.663 asserts 130 130 130 0 n/a 00:31:49.663 00:31:49.663 Elapsed time = 0.109 seconds 00:31:49.663 0 00:31:49.663 10:39:26 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:31:49.663 10:39:26 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:31:49.920 10:39:26 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:31:50.176 10:39:27 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:31:50.176 10:39:27 compress_isal -- compress/compress.sh@62 -- # killprocess 652503 00:31:50.176 10:39:27 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 652503 ']' 00:31:50.176 10:39:27 compress_isal -- common/autotest_common.sh@952 -- # kill -0 652503 00:31:50.176 10:39:27 compress_isal -- common/autotest_common.sh@953 -- # uname 00:31:50.177 10:39:27 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:50.177 10:39:27 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 
652503 00:31:50.177 10:39:27 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:50.177 10:39:27 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:50.177 10:39:27 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 652503' 00:31:50.177 killing process with pid 652503 00:31:50.177 10:39:27 compress_isal -- common/autotest_common.sh@967 -- # kill 652503 00:31:50.177 10:39:27 compress_isal -- common/autotest_common.sh@972 -- # wait 652503 00:31:53.451 10:39:30 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:31:53.451 10:39:30 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:31:53.451 00:31:53.451 real 0m47.619s 00:31:53.451 user 1m51.421s 00:31:53.451 sys 0m4.179s 00:31:53.451 10:39:30 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:53.451 10:39:30 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:31:53.451 ************************************ 00:31:53.451 END TEST compress_isal 00:31:53.451 ************************************ 00:31:53.451 10:39:30 -- common/autotest_common.sh@1142 -- # return 0 00:31:53.451 10:39:30 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:31:53.451 10:39:30 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:31:53.451 10:39:30 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:31:53.451 10:39:30 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:53.451 10:39:30 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:53.451 10:39:30 -- common/autotest_common.sh@10 -- # set +x 00:31:53.451 ************************************ 00:31:53.451 START TEST blockdev_crypto_aesni 00:31:53.451 ************************************ 00:31:53.451 10:39:30 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 
crypto_aesni 00:31:53.451 * Looking for test storage... 00:31:53.451 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:53.451 10:39:30 blockdev_crypto_aesni -- 
bdev/blockdev.sh@684 -- # dek= 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=653789 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:53.451 10:39:30 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 653789 00:31:53.451 10:39:30 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 653789 ']' 00:31:53.451 10:39:30 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:53.451 10:39:30 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:53.451 10:39:30 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:53.451 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:31:53.451 10:39:30 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:53.451 10:39:30 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:53.451 [2024-07-15 10:39:30.441715] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:31:53.451 [2024-07-15 10:39:30.441770] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid653789 ] 00:31:53.451 [2024-07-15 10:39:30.552528] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:53.708 [2024-07-15 10:39:30.649639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:54.272 10:39:31 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:54.272 10:39:31 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:31:54.272 10:39:31 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:54.272 10:39:31 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:31:54.272 10:39:31 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:31:54.272 10:39:31 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:54.272 10:39:31 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:54.272 [2024-07-15 10:39:31.319782] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:54.272 [2024-07-15 10:39:31.327813] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:54.272 [2024-07-15 10:39:31.335832] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:54.272 [2024-07-15 10:39:31.404277] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:31:56.801 true 00:31:56.801 true 00:31:56.801 true 00:31:56.801 true 00:31:56.801 Malloc0 00:31:56.801 Malloc1 00:31:56.801 Malloc2 00:31:56.801 Malloc3 00:31:56.801 [2024-07-15 10:39:33.814682] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:56.801 crypto_ram 00:31:56.801 [2024-07-15 10:39:33.822695] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:56.801 crypto_ram2 00:31:56.801 [2024-07-15 10:39:33.830716] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:56.801 crypto_ram3 00:31:56.801 [2024-07-15 10:39:33.838737] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:56.801 crypto_ram4 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.801 10:39:33 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:56.801 10:39:33 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:56.801 10:39:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:57.059 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:57.059 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:57.060 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa7c2f78-4b18-51aa-aaab-c2e89161537c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "aa7c2f78-4b18-51aa-aaab-c2e89161537c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "0b8fc9c8-b5a6-5ec8-b44e-5a84793ef54e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0b8fc9c8-b5a6-5ec8-b44e-5a84793ef54e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "9f49c0ce-de1d-55b0-9e12-49c0f6d4805c"' ' ],' ' "product_name": "crypto",' ' 
"block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9f49c0ce-de1d-55b0-9e12-49c0f6d4805c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "c798c2b3-ea10-5c01-93c0-eac73e6f1783"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c798c2b3-ea10-5c01-93c0-eac73e6f1783",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' 
"nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:57.060 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:57.060 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:57.060 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:57.060 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:57.060 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 653789 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 653789 ']' 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 653789 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 653789 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 653789' 00:31:57.060 killing process with pid 653789 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 653789 00:31:57.060 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 653789 00:31:57.625 
10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:57.625 10:39:34 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:57.625 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:57.625 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:57.625 10:39:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:57.625 ************************************ 00:31:57.625 START TEST bdev_hello_world 00:31:57.625 ************************************ 00:31:57.625 10:39:34 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:57.625 [2024-07-15 10:39:34.744624] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:31:57.625 [2024-07-15 10:39:34.744684] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654361 ] 00:31:57.883 [2024-07-15 10:39:34.872259] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:57.883 [2024-07-15 10:39:34.968669] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:57.883 [2024-07-15 10:39:34.989937] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:57.883 [2024-07-15 10:39:34.997963] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:57.883 [2024-07-15 10:39:35.005986] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:58.140 [2024-07-15 10:39:35.117090] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:00.670 [2024-07-15 10:39:37.335770] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:00.670 [2024-07-15 10:39:37.335831] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:00.670 [2024-07-15 10:39:37.335846] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:00.670 [2024-07-15 10:39:37.343789] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:00.670 [2024-07-15 10:39:37.343808] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:00.670 [2024-07-15 10:39:37.343820] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:00.670 [2024-07-15 10:39:37.351810] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:32:00.670 [2024-07-15 10:39:37.351828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:00.670 [2024-07-15 10:39:37.351839] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:00.670 [2024-07-15 10:39:37.359829] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:00.670 [2024-07-15 10:39:37.359847] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:00.670 [2024-07-15 10:39:37.359858] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:00.670 [2024-07-15 10:39:37.435304] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:00.670 [2024-07-15 10:39:37.435347] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:00.670 [2024-07-15 10:39:37.435366] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:00.670 [2024-07-15 10:39:37.436663] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:00.670 [2024-07-15 10:39:37.436737] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:00.670 [2024-07-15 10:39:37.436753] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:00.670 [2024-07-15 10:39:37.436798] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:32:00.670 00:32:00.670 [2024-07-15 10:39:37.436817] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:00.670 00:32:00.670 real 0m3.175s 00:32:00.670 user 0m2.778s 00:32:00.670 sys 0m0.355s 00:32:00.670 10:39:37 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:00.670 10:39:37 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:00.670 ************************************ 00:32:00.670 END TEST bdev_hello_world 00:32:00.670 ************************************ 00:32:00.928 10:39:37 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:00.928 10:39:37 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:00.928 10:39:37 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:00.928 10:39:37 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:00.928 10:39:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:00.928 ************************************ 00:32:00.928 START TEST bdev_bounds 00:32:00.928 ************************************ 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=654734 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 654734' 00:32:00.928 Process bdevio pid: 654734 00:32:00.928 10:39:37 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 654734 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 654734 ']' 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:00.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:00.928 10:39:37 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:00.928 [2024-07-15 10:39:38.006248] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:32:00.928 [2024-07-15 10:39:38.006313] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid654734 ] 00:32:01.186 [2024-07-15 10:39:38.133722] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:01.186 [2024-07-15 10:39:38.243853] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:01.186 [2024-07-15 10:39:38.243947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:01.186 [2024-07-15 10:39:38.243957] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:01.186 [2024-07-15 10:39:38.265296] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:01.186 [2024-07-15 10:39:38.273316] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:01.186 [2024-07-15 10:39:38.281339] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:01.444 [2024-07-15 10:39:38.400368] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:03.970 [2024-07-15 10:39:40.630846] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:03.970 [2024-07-15 10:39:40.630933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:03.970 [2024-07-15 10:39:40.630949] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.970 [2024-07-15 10:39:40.638864] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:32:03.970 [2024-07-15 10:39:40.638885] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:03.970 [2024-07-15 
10:39:40.638897] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.970 [2024-07-15 10:39:40.646887] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:03.970 [2024-07-15 10:39:40.646908] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:03.970 [2024-07-15 10:39:40.646921] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.970 [2024-07-15 10:39:40.654909] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:03.970 [2024-07-15 10:39:40.654933] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:03.970 [2024-07-15 10:39:40.654944] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.970 10:39:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:03.970 10:39:40 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:03.970 10:39:40 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:03.970 I/O targets: 00:32:03.970 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:03.970 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:32:03.970 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:03.970 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:32:03.970 00:32:03.970 00:32:03.970 CUnit - A unit testing framework for C - Version 2.1-3 00:32:03.970 http://cunit.sourceforge.net/ 00:32:03.970 00:32:03.970 00:32:03.970 Suite: bdevio tests on: crypto_ram4 00:32:03.970 Test: blockdev write read block ...passed 00:32:03.970 Test: blockdev write zeroes read block ...passed 00:32:03.970 Test: blockdev write zeroes read no split ...passed 00:32:03.970 Test: blockdev 
write zeroes read split ...passed 00:32:03.970 Test: blockdev write zeroes read split partial ...passed 00:32:03.970 Test: blockdev reset ...passed 00:32:03.970 Test: blockdev write read 8 blocks ...passed 00:32:03.970 Test: blockdev write read size > 128k ...passed 00:32:03.970 Test: blockdev write read invalid size ...passed 00:32:03.970 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:03.970 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:03.970 Test: blockdev write read max offset ...passed 00:32:03.970 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:03.970 Test: blockdev writev readv 8 blocks ...passed 00:32:03.970 Test: blockdev writev readv 30 x 1block ...passed 00:32:03.970 Test: blockdev writev readv block ...passed 00:32:03.970 Test: blockdev writev readv size > 128k ...passed 00:32:03.970 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:03.970 Test: blockdev comparev and writev ...passed 00:32:03.970 Test: blockdev nvme passthru rw ...passed 00:32:03.970 Test: blockdev nvme passthru vendor specific ...passed 00:32:03.970 Test: blockdev nvme admin passthru ...passed 00:32:03.970 Test: blockdev copy ...passed 00:32:03.970 Suite: bdevio tests on: crypto_ram3 00:32:03.970 Test: blockdev write read block ...passed 00:32:03.970 Test: blockdev write zeroes read block ...passed 00:32:03.970 Test: blockdev write zeroes read no split ...passed 00:32:03.970 Test: blockdev write zeroes read split ...passed 00:32:03.970 Test: blockdev write zeroes read split partial ...passed 00:32:03.970 Test: blockdev reset ...passed 00:32:03.970 Test: blockdev write read 8 blocks ...passed 00:32:03.970 Test: blockdev write read size > 128k ...passed 00:32:03.970 Test: blockdev write read invalid size ...passed 00:32:03.970 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:03.970 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:32:03.970 Test: blockdev write read max offset ...passed 00:32:03.970 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:03.971 Test: blockdev writev readv 8 blocks ...passed 00:32:03.971 Test: blockdev writev readv 30 x 1block ...passed 00:32:03.971 Test: blockdev writev readv block ...passed 00:32:03.971 Test: blockdev writev readv size > 128k ...passed 00:32:03.971 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:03.971 Test: blockdev comparev and writev ...passed 00:32:03.971 Test: blockdev nvme passthru rw ...passed 00:32:03.971 Test: blockdev nvme passthru vendor specific ...passed 00:32:03.971 Test: blockdev nvme admin passthru ...passed 00:32:03.971 Test: blockdev copy ...passed 00:32:03.971 Suite: bdevio tests on: crypto_ram2 00:32:03.971 Test: blockdev write read block ...passed 00:32:03.971 Test: blockdev write zeroes read block ...passed 00:32:03.971 Test: blockdev write zeroes read no split ...passed 00:32:03.971 Test: blockdev write zeroes read split ...passed 00:32:03.971 Test: blockdev write zeroes read split partial ...passed 00:32:03.971 Test: blockdev reset ...passed 00:32:03.971 Test: blockdev write read 8 blocks ...passed 00:32:03.971 Test: blockdev write read size > 128k ...passed 00:32:03.971 Test: blockdev write read invalid size ...passed 00:32:03.971 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:03.971 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:03.971 Test: blockdev write read max offset ...passed 00:32:03.971 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:03.971 Test: blockdev writev readv 8 blocks ...passed 00:32:03.971 Test: blockdev writev readv 30 x 1block ...passed 00:32:03.971 Test: blockdev writev readv block ...passed 00:32:03.971 Test: blockdev writev readv size > 128k ...passed 00:32:03.971 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:03.971 Test: 
blockdev comparev and writev ...passed 00:32:03.971 Test: blockdev nvme passthru rw ...passed 00:32:03.971 Test: blockdev nvme passthru vendor specific ...passed 00:32:03.971 Test: blockdev nvme admin passthru ...passed 00:32:03.971 Test: blockdev copy ...passed 00:32:03.971 Suite: bdevio tests on: crypto_ram 00:32:03.971 Test: blockdev write read block ...passed 00:32:03.971 Test: blockdev write zeroes read block ...passed 00:32:03.971 Test: blockdev write zeroes read no split ...passed 00:32:03.971 Test: blockdev write zeroes read split ...passed 00:32:03.971 Test: blockdev write zeroes read split partial ...passed 00:32:03.971 Test: blockdev reset ...passed 00:32:03.971 Test: blockdev write read 8 blocks ...passed 00:32:03.971 Test: blockdev write read size > 128k ...passed 00:32:03.971 Test: blockdev write read invalid size ...passed 00:32:03.971 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:03.971 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:03.971 Test: blockdev write read max offset ...passed 00:32:03.971 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:03.971 Test: blockdev writev readv 8 blocks ...passed 00:32:03.971 Test: blockdev writev readv 30 x 1block ...passed 00:32:03.971 Test: blockdev writev readv block ...passed 00:32:03.971 Test: blockdev writev readv size > 128k ...passed 00:32:03.971 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:03.971 Test: blockdev comparev and writev ...passed 00:32:03.971 Test: blockdev nvme passthru rw ...passed 00:32:03.971 Test: blockdev nvme passthru vendor specific ...passed 00:32:03.971 Test: blockdev nvme admin passthru ...passed 00:32:03.971 Test: blockdev copy ...passed 00:32:03.971 00:32:03.971 Run Summary: Type Total Ran Passed Failed Inactive 00:32:03.971 suites 4 4 n/a 0 0 00:32:03.971 tests 92 92 92 0 0 00:32:03.971 asserts 520 520 520 0 n/a 00:32:03.971 00:32:03.971 Elapsed time = 0.538 
seconds 00:32:03.971 0 00:32:03.971 10:39:41 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 654734 00:32:03.971 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 654734 ']' 00:32:03.971 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 654734 00:32:03.971 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:03.971 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:03.971 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 654734 00:32:04.229 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:04.229 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:04.229 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 654734' 00:32:04.229 killing process with pid 654734 00:32:04.229 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 654734 00:32:04.229 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 654734 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:04.487 00:32:04.487 real 0m3.643s 00:32:04.487 user 0m10.048s 00:32:04.487 sys 0m0.600s 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:04.487 ************************************ 00:32:04.487 END TEST bdev_bounds 00:32:04.487 ************************************ 00:32:04.487 10:39:41 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:32:04.487 10:39:41 
blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:04.487 10:39:41 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:04.487 10:39:41 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:04.487 10:39:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:32:04.487 ************************************ 00:32:04.487 START TEST bdev_nbd 00:32:04.487 ************************************ 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:04.487 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:04.744 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:04.744 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=655287 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 655287 /var/tmp/spdk-nbd.sock 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 655287 ']' 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:32:04.745 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:04.745 10:39:41 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:04.745 [2024-07-15 10:39:41.748513] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:32:04.745 [2024-07-15 10:39:41.748580] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:04.745 [2024-07-15 10:39:41.879370] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:05.003 [2024-07-15 10:39:41.983725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:05.003 [2024-07-15 10:39:42.005027] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:32:05.003 [2024-07-15 10:39:42.013047] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:05.003 [2024-07-15 10:39:42.021067] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:05.003 [2024-07-15 10:39:42.132414] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:32:07.526 [2024-07-15 10:39:44.362545] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:32:07.526 [2024-07-15 10:39:44.362607] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:07.526 [2024-07-15 10:39:44.362623] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:07.526 [2024-07-15 10:39:44.370565] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_aesni_cbc_2" 00:32:07.526 [2024-07-15 10:39:44.370585] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:07.527 [2024-07-15 10:39:44.370597] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:07.527 [2024-07-15 10:39:44.378584] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:32:07.527 [2024-07-15 10:39:44.378602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:07.527 [2024-07-15 10:39:44.378614] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:07.527 [2024-07-15 10:39:44.386605] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:32:07.527 [2024-07-15 10:39:44.386622] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:07.527 [2024-07-15 10:39:44.386634] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:07.527 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:07.784 10:39:44 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:07.784 1+0 records in 00:32:07.784 1+0 records out 00:32:07.784 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00134808 s, 3.0 MB/s 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:07.784 10:39:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:08.042 10:39:45 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:08.042 1+0 records in 00:32:08.042 1+0 records out 00:32:08.042 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276822 s, 14.8 MB/s 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:08.042 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:08.300 1+0 records in 00:32:08.300 1+0 records out 00:32:08.300 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288021 s, 14.2 MB/s 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:08.300 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 
count=1 iflag=direct 00:32:08.559 1+0 records in 00:32:08.559 1+0 records out 00:32:08.559 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000349656 s, 11.7 MB/s 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:08.559 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:08.830 { 00:32:08.830 "nbd_device": "/dev/nbd0", 00:32:08.830 "bdev_name": "crypto_ram" 00:32:08.830 }, 00:32:08.830 { 00:32:08.830 "nbd_device": "/dev/nbd1", 00:32:08.830 "bdev_name": "crypto_ram2" 00:32:08.830 }, 00:32:08.830 { 00:32:08.830 "nbd_device": "/dev/nbd2", 00:32:08.830 "bdev_name": "crypto_ram3" 00:32:08.830 }, 00:32:08.830 { 00:32:08.830 "nbd_device": "/dev/nbd3", 00:32:08.830 "bdev_name": "crypto_ram4" 00:32:08.830 } 00:32:08.830 ]' 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:08.830 { 
00:32:08.830 "nbd_device": "/dev/nbd0", 00:32:08.830 "bdev_name": "crypto_ram" 00:32:08.830 }, 00:32:08.830 { 00:32:08.830 "nbd_device": "/dev/nbd1", 00:32:08.830 "bdev_name": "crypto_ram2" 00:32:08.830 }, 00:32:08.830 { 00:32:08.830 "nbd_device": "/dev/nbd2", 00:32:08.830 "bdev_name": "crypto_ram3" 00:32:08.830 }, 00:32:08.830 { 00:32:08.830 "nbd_device": "/dev/nbd3", 00:32:08.830 "bdev_name": "crypto_ram4" 00:32:08.830 } 00:32:08.830 ]' 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:08.830 10:39:45 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:09.141 10:39:46 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:09.141 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@39 -- # sleep 0.1 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i++ )) 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:09.398 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:09.656 10:39:46 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 
-- # return 0 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:09.914 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4')
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4')
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:10.172 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
00:32:10.429 /dev/nbd0
00:32:10.429 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:32:10.429 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:32:10.429 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:32:10.429 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:10.430 1+0 records in
00:32:10.430 1+0 records out
00:32:10.430 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307443 s, 13.3 MB/s
00:32:10.430 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:10.687 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:32:10.687 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:10.687 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:32:10.687 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:32:10.687 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:32:10.687 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:10.687 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1
00:32:10.687 /dev/nbd1
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:10.945 1+0 records in
00:32:10.945 1+0 records out
00:32:10.945 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317356 s, 12.9 MB/s
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:10.945 10:39:47 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10
00:32:11.203 /dev/nbd10
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:11.203 1+0 records in
1+0 records out
00:32:11.203 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322492 s, 12.7 MB/s
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:11.203 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11
00:32:11.461 /dev/nbd11
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:11.461 1+0 records in
00:32:11.461 1+0 records out
00:32:11.461 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342767 s, 11.9 MB/s
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:11.461 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:32:11.718 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd0",
00:32:11.718 "bdev_name": "crypto_ram"
00:32:11.718 },
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd1",
00:32:11.718 "bdev_name": "crypto_ram2"
00:32:11.718 },
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd10",
00:32:11.718 "bdev_name": "crypto_ram3"
00:32:11.718 },
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd11",
00:32:11.718 "bdev_name": "crypto_ram4"
00:32:11.718 }
00:32:11.718 ]'
00:32:11.718 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd0",
00:32:11.718 "bdev_name": "crypto_ram"
00:32:11.718 },
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd1",
00:32:11.718 "bdev_name": "crypto_ram2"
00:32:11.718 },
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd10",
00:32:11.718 "bdev_name": "crypto_ram3"
00:32:11.718 },
00:32:11.718 {
00:32:11.718 "nbd_device": "/dev/nbd11",
00:32:11.718 "bdev_name": "crypto_ram4"
00:32:11.718 }
00:32:11.718 ]'
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:32:11.719 /dev/nbd1
00:32:11.719 /dev/nbd10
00:32:11.719 /dev/nbd11'
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:32:11.719 /dev/nbd1
00:32:11.719 /dev/nbd10
00:32:11.719 /dev/nbd11'
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']'
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256
00:32:11.719 256+0 records in
00:32:11.719 256+0 records out
00:32:11.719 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105673 s, 99.2 MB/s
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:32:11.719 256+0 records in
00:32:11.719 256+0 records out
00:32:11.719 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.06199 s, 16.9 MB/s
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:32:11.719 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:32:11.976 256+0 records in
00:32:11.976 256+0 records out
00:32:11.976 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0659475 s, 15.9 MB/s
00:32:11.976 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:32:11.976 10:39:48 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct
00:32:11.976 256+0 records in
00:32:11.976 256+0 records out
00:32:11.976 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0598771 s, 17.5 MB/s
00:32:11.976 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:32:11.976 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct
00:32:11.976 256+0 records in
00:32:11.976 256+0 records out
00:32:11.976 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0573062 s, 18.3 MB/s
00:32:11.976 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify
00:32:11.976 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:11.976 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:32:11.976 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify
00:32:11.976 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:11.977 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:12.234 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:12.235 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:12.235 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:32:12.492 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:32:12.492 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:32:12.492 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:32:12.493 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:12.493 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:12.493 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:32:12.493 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:12.493 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:12.493 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:12.493 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:12.750 10:39:49 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11
00:32:13.008 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11
00:32:13.008 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:13.265 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret
00:32:13.523 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512
00:32:13.781 malloc_lvol_verify
00:32:13.781 10:39:50 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs
00:32:14.039 b961263f-4e49-4381-85de-17e005614836
00:32:14.039 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs
00:32:14.298 3254e3ea-d808-433e-91d7-6ac285e24e25
00:32:14.298 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0
00:32:14.557 /dev/nbd0
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0
00:32:14.557 mke2fs 1.46.5 (30-Dec-2021)
00:32:14.557 Discarding device blocks: 0/4096 done
00:32:14.557 Creating filesystem with 4096 1k blocks and 1024 inodes
00:32:14.557
00:32:14.557 Allocating group tables: 0/1 done
00:32:14.557 Writing inode tables: 0/1 done
00:32:14.557 Creating journal (1024 blocks): done
00:32:14.557 Writing superblocks and filesystem accounting information: 0/1 done
00:32:14.557
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:14.557 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']'
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 655287
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 655287 ']'
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 655287
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 655287
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 655287'
killing process with pid 655287
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 655287
00:32:14.815 10:39:51 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 655287
00:32:15.074 10:39:52 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT
00:32:15.074
00:32:15.074 real 0m10.588s
00:32:15.074 user 0m13.824s
00:32:15.074 sys 0m4.191s
00:32:15.074 10:39:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:15.074 10:39:52 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:32:15.074 ************************************
00:32:15.074 END TEST bdev_nbd
00:32:15.074 ************************************
00:32:15.333 10:39:52 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:15.333 10:39:52 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]]
00:32:15.333 10:39:52 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite ''
00:32:15.333 10:39:52 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:15.333 10:39:52 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:15.333 ************************************
00:32:15.333 START TEST bdev_fio
00:32:15.333 ************************************
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite ''
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo ''
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=//
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context=
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO ''
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context=
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]]
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}"
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:32:15.333 ************************************
00:32:15.333 START TEST bdev_fio_rw_verify
************************************
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan')
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib=
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}"
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:32:15.333 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:32:15.601 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=
00:32:15.601 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:32:15.601 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:32:15.601 10:39:52 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:15.860 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:15.860 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:15.860 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:15.860 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:15.860 fio-3.35 00:32:15.860 Starting 4 threads 00:32:30.742 00:32:30.742 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=657323: Mon Jul 15 10:40:05 2024 00:32:30.742 read: IOPS=18.0k, BW=70.2MiB/s (73.7MB/s)(703MiB/10001msec) 00:32:30.742 slat (usec): min=16, max=348, avg=78.13, stdev=27.38 00:32:30.742 clat (usec): min=16, max=2299, avg=409.40, stdev=211.51 00:32:30.742 lat (usec): min=41, max=2400, avg=487.53, stdev=220.72 00:32:30.742 clat percentiles (usec): 00:32:30.742 | 50.000th=[ 388], 99.000th=[ 906], 99.900th=[ 1336], 99.990th=[ 1696], 00:32:30.742 | 99.999th=[ 2212] 00:32:30.742 write: IOPS=19.8k, BW=77.5MiB/s (81.3MB/s)(757MiB/9757msec); 0 zone resets 00:32:30.742 slat (usec): min=17, max=1591, avg=89.31, stdev=27.14 00:32:30.742 clat (usec): min=28, max=2516, avg=478.55, stdev=239.82 00:32:30.742 lat (usec): min=59, max=2620, avg=567.86, stdev=247.77 00:32:30.742 clat percentiles (usec): 00:32:30.742 | 50.000th=[ 453], 99.000th=[ 1057], 99.900th=[ 1762], 99.990th=[ 2008], 00:32:30.742 | 99.999th=[ 2114] 00:32:30.742 bw ( KiB/s): min=57184, max=115606, per=97.55%, avg=77459.26, stdev=3336.90, samples=76 00:32:30.742 iops : min=14296, max=28901, avg=19364.79, stdev=834.19, samples=76 00:32:30.742 lat (usec) : 20=0.01%, 50=0.01%, 100=2.40%, 250=21.07%, 
500=38.18% 00:32:30.742 lat (usec) : 750=28.00%, 1000=9.24% 00:32:30.742 lat (msec) : 2=1.10%, 4=0.01% 00:32:30.742 cpu : usr=99.59%, sys=0.01%, ctx=71, majf=0, minf=296 00:32:30.742 IO depths : 1=9.2%, 2=25.8%, 4=51.7%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:30.742 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:30.742 complete : 0=0.0%, 4=88.6%, 8=11.4%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:30.742 issued rwts: total=179842,193684,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:30.742 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:30.742 00:32:30.742 Run status group 0 (all jobs): 00:32:30.742 READ: bw=70.2MiB/s (73.7MB/s), 70.2MiB/s-70.2MiB/s (73.7MB/s-73.7MB/s), io=703MiB (737MB), run=10001-10001msec 00:32:30.742 WRITE: bw=77.5MiB/s (81.3MB/s), 77.5MiB/s-77.5MiB/s (81.3MB/s-81.3MB/s), io=757MiB (793MB), run=9757-9757msec 00:32:30.742 00:32:30.742 real 0m13.536s 00:32:30.742 user 0m45.780s 00:32:30.742 sys 0m0.498s 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:30.742 ************************************ 00:32:30.742 END TEST bdev_fio_rw_verify 00:32:30.742 ************************************ 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:30.742 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa7c2f78-4b18-51aa-aaab-c2e89161537c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "aa7c2f78-4b18-51aa-aaab-c2e89161537c",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "0b8fc9c8-b5a6-5ec8-b44e-5a84793ef54e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0b8fc9c8-b5a6-5ec8-b44e-5a84793ef54e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "9f49c0ce-de1d-55b0-9e12-49c0f6d4805c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "9f49c0ce-de1d-55b0-9e12-49c0f6d4805c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "c798c2b3-ea10-5c01-93c0-eac73e6f1783"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c798c2b3-ea10-5c01-93c0-eac73e6f1783",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:30.743 crypto_ram2 00:32:30.743 crypto_ram3 00:32:30.743 crypto_ram4 ]] 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "aa7c2f78-4b18-51aa-aaab-c2e89161537c"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "aa7c2f78-4b18-51aa-aaab-c2e89161537c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "0b8fc9c8-b5a6-5ec8-b44e-5a84793ef54e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0b8fc9c8-b5a6-5ec8-b44e-5a84793ef54e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "9f49c0ce-de1d-55b0-9e12-49c0f6d4805c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 
8192,' ' "uuid": "9f49c0ce-de1d-55b0-9e12-49c0f6d4805c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "c798c2b3-ea10-5c01-93c0-eac73e6f1783"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c798c2b3-ea10-5c01-93c0-eac73e6f1783",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:32:30.743 
10:40:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:30.743 ************************************ 00:32:30.743 START TEST bdev_fio_trim 00:32:30.743 ************************************ 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:30.743 10:40:06 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:30.743 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:30.744 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:30.744 10:40:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:30.744 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:30.744 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:30.744 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:30.744 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:30.744 fio-3.35 00:32:30.744 Starting 4 threads 00:32:42.945 00:32:42.945 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=659186: Mon Jul 15 10:40:19 2024 00:32:42.945 write: IOPS=44.2k, BW=173MiB/s (181MB/s)(1726MiB/10001msec); 0 zone resets 00:32:42.945 slat (usec): min=11, max=1458, avg=52.50, stdev=41.02 00:32:42.945 clat (usec): min=17, max=2293, avg=228.66, stdev=192.33 00:32:42.945 lat (usec): min=37, max=2484, avg=281.16, stdev=222.50 00:32:42.945 clat percentiles (usec): 00:32:42.945 | 50.000th=[ 174], 99.000th=[ 
1057], 99.900th=[ 1221], 99.990th=[ 1319], 00:32:42.945 | 99.999th=[ 1991] 00:32:42.945 bw ( KiB/s): min=173304, max=201985, per=100.00%, avg=176826.16, stdev=1919.21, samples=76 00:32:42.945 iops : min=43326, max=50496, avg=44206.53, stdev=479.80, samples=76 00:32:42.945 trim: IOPS=44.2k, BW=173MiB/s (181MB/s)(1726MiB/10001msec); 0 zone resets 00:32:42.945 slat (usec): min=4, max=359, avg=13.65, stdev= 6.78 00:32:42.945 clat (usec): min=38, max=1962, avg=216.30, stdev=126.16 00:32:42.945 lat (usec): min=46, max=1985, avg=229.95, stdev=129.68 00:32:42.945 clat percentiles (usec): 00:32:42.945 | 50.000th=[ 190], 99.000th=[ 734], 99.900th=[ 824], 99.990th=[ 898], 00:32:42.945 | 99.999th=[ 1385] 00:32:42.945 bw ( KiB/s): min=173304, max=202001, per=100.00%, avg=176827.42, stdev=1920.28, samples=76 00:32:42.945 iops : min=43326, max=50500, avg=44206.84, stdev=480.06, samples=76 00:32:42.945 lat (usec) : 20=0.01%, 50=1.60%, 100=12.42%, 250=60.14%, 500=19.34% 00:32:42.945 lat (usec) : 750=4.49%, 1000=1.25% 00:32:42.945 lat (msec) : 2=0.76%, 4=0.01% 00:32:42.945 cpu : usr=99.62%, sys=0.00%, ctx=67, majf=0, minf=91 00:32:42.945 IO depths : 1=8.6%, 2=26.1%, 4=52.2%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:42.945 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:42.945 complete : 0=0.0%, 4=88.5%, 8=11.5%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:42.945 issued rwts: total=0,441740,441741,0 short=0,0,0,0 dropped=0,0,0,0 00:32:42.945 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:42.945 00:32:42.945 Run status group 0 (all jobs): 00:32:42.945 WRITE: bw=173MiB/s (181MB/s), 173MiB/s-173MiB/s (181MB/s-181MB/s), io=1726MiB (1809MB), run=10001-10001msec 00:32:42.945 TRIM: bw=173MiB/s (181MB/s), 173MiB/s-173MiB/s (181MB/s-181MB/s), io=1726MiB (1809MB), run=10001-10001msec 00:32:42.945 00:32:42.945 real 0m13.541s 00:32:42.945 user 0m45.463s 00:32:42.945 sys 0m0.485s 00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1124 -- # xtrace_disable
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:32:42.945 ************************************
00:32:42.945 END TEST bdev_fio_trim
00:32:42.945 ************************************
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:32:42.945 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:32:42.945
00:32:42.945 real 0m27.442s
00:32:42.945 user 1m31.437s
00:32:42.945 sys 0m1.180s
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:32:42.945 ************************************
00:32:42.945 END TEST bdev_fio
00:32:42.945 ************************************
00:32:42.945 10:40:19 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:42.945 10:40:19 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:32:42.945 10:40:19 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:32:42.945 10:40:19 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:32:42.945 10:40:19 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:42.945 10:40:19 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:42.945 ************************************
00:32:42.945 START TEST bdev_verify
00:32:42.945 ************************************
00:32:42.945 10:40:19 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:32:42.945 [2024-07-15 10:40:19.945444] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:32:42.945 [2024-07-15 10:40:19.945511] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid660604 ]
00:32:42.945 [2024-07-15 10:40:20.076892] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:32:43.203 [2024-07-15 10:40:20.187369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:32:43.203 [2024-07-15 10:40:20.187375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:43.203 [2024-07-15 10:40:20.208733] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:43.203 [2024-07-15 10:40:20.216765] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:43.203 [2024-07-15 10:40:20.224809] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:43.203 [2024-07-15 10:40:20.332419] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:32:45.740 [2024-07-15 10:40:22.576743] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:45.740 [2024-07-15 10:40:22.576825] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:45.740 [2024-07-15 10:40:22.576840] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:45.740 [2024-07-15 10:40:22.584758] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:45.740 [2024-07-15 10:40:22.584778] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:45.740 [2024-07-15 10:40:22.584790] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:45.740 [2024-07-15 10:40:22.592778] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:45.740 [2024-07-15 10:40:22.592796] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:45.740 [2024-07-15 10:40:22.592807] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:45.740 [2024-07-15 10:40:22.600800] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:45.740 [2024-07-15 10:40:22.600817] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:45.740 [2024-07-15 10:40:22.600829] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:45.740 Running I/O for 5 seconds...
00:32:51.044
00:32:51.044 Latency(us)
00:32:51.044 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:51.044 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x0 length 0x1000
00:32:51.044 crypto_ram : 5.07 505.29 1.97 0.00 0.00 252788.57 5014.93 175978.41
00:32:51.044 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x1000 length 0x1000
00:32:51.044 crypto_ram : 5.07 505.32 1.97 0.00 0.00 252761.59 5755.77 175978.41
00:32:51.044 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x0 length 0x1000
00:32:51.044 crypto_ram2 : 5.07 504.91 1.97 0.00 0.00 252191.71 5071.92 165948.55
00:32:51.044 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x1000 length 0x1000
00:32:51.044 crypto_ram2 : 5.07 505.03 1.97 0.00 0.00 252175.28 5841.25 165948.55
00:32:51.044 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x0 length 0x1000
00:32:51.044 crypto_ram3 : 5.05 3902.48 15.24 0.00 0.00 32526.99 5128.90 26670.30
00:32:51.044 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x1000 length 0x1000
00:32:51.044 crypto_ram3 : 5.05 3923.59 15.33 0.00 0.00 32349.86 4445.05 26784.28
00:32:51.044 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x0 length 0x1000
00:32:51.044 crypto_ram4 : 5.05 3901.63 15.24 0.00 0.00 32430.94 5328.36 25872.47
00:32:51.044 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:51.044 Verification LBA range: start 0x1000 length 0x1000
00:32:51.044 crypto_ram4 : 5.06 3924.14 15.33 0.00 0.00 32245.40 4473.54 25872.47
00:32:51.044 ===================================================================================================================
00:32:51.044 Total : 17672.37 69.03 0.00 0.00 57615.35 4445.05 175978.41
00:32:51.044
00:32:51.044 real 0m8.336s
00:32:51.044 user 0m15.763s
00:32:51.044 sys 0m0.394s
00:32:51.044 10:40:28 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:51.044 10:40:28 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:32:51.044 ************************************
00:32:51.044 END TEST bdev_verify
00:32:51.044 ************************************
00:32:51.304 10:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:32:51.304 10:40:28 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:51.304 10:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:32:51.304 10:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:51.304 10:40:28 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:32:51.304 ************************************
00:32:51.304 START TEST bdev_verify_big_io
00:32:51.304 ************************************
00:32:51.304 10:40:28 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:51.304 [2024-07-15 10:40:28.362355] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:32:51.304 [2024-07-15 10:40:28.362414] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid661677 ]
00:32:51.304 [2024-07-15 10:40:28.491785] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2
00:32:51.563 [2024-07-15 10:40:28.590698] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:32:51.563 [2024-07-15 10:40:28.590704] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:51.563 [2024-07-15 10:40:28.612038] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:32:51.563 [2024-07-15 10:40:28.620064] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:51.563 [2024-07-15 10:40:28.628092] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:51.563 [2024-07-15 10:40:28.736408] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:32:54.100 [2024-07-15 10:40:30.950199] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:32:54.100 [2024-07-15 10:40:30.950290] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:54.100 [2024-07-15 10:40:30.950305] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:54.100 [2024-07-15 10:40:30.958213] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:32:54.100 [2024-07-15 10:40:30.958233] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:54.100 [2024-07-15 10:40:30.958245] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:54.100 [2024-07-15 10:40:30.966237] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:32:54.100 [2024-07-15 10:40:30.966255] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:54.100 [2024-07-15 10:40:30.966266] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:54.100 [2024-07-15 10:40:30.974257] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:32:54.100 [2024-07-15 10:40:30.974275] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:54.100 [2024-07-15 10:40:30.974287] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:54.100 Running I/O for 5 seconds...
00:32:56.659 [2024-07-15 10:40:33.571179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.659 [2024-07-15 10:40:33.572174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.659 [2024-07-15 10:40:33.573404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.659 [2024-07-15 10:40:33.574879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.659 [2024-07-15 10:40:33.577266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.659 [2024-07-15 10:40:33.578779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.659 [2024-07-15 10:40:33.580294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.659 [2024-07-15 10:40:33.581003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.662 [2024-07-15 10:40:33.787206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:56.662 [last message repeated continuously between the two timestamps above]
00:32:56.662 [2024-07-15 10:40:33.787263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.787644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.787688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.789198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.789256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.789987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.790038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.791601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.791657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.792061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.792107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.794119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.794187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.662 [2024-07-15 10:40:33.794995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.662 [2024-07-15 10:40:33.795043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.796691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.796751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.797899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.797953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.799245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.799303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.800617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.800664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.802251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.802309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.803588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.663 [2024-07-15 10:40:33.803638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.804526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.804591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.806120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.806170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.807855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.807921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.808321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.808385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.809257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.809312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.809703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.809753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.663 [2024-07-15 10:40:33.811626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.811710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.812113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.812166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.813045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.813101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.813489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.813545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.815650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.815714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.816118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.816174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.816193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.663 [2024-07-15 10:40:33.816553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.817061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.817115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.817501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.817564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.817585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.818044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.819458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.819857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.819922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.820320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.820696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.820854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.663 [2024-07-15 10:40:33.821254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.821300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.821685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.822158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.823541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.823593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.823634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.823675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.824007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.824170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.824216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.824258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.824300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.663 [2024-07-15 10:40:33.824615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.825628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.825679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.663 [2024-07-15 10:40:33.825720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.825776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.826168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.826326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.826384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.826429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.826504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.826914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.664 [2024-07-15 10:40:33.828216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.828864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.829244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.830286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.830338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.830380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.830422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.830763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.664 [2024-07-15 10:40:33.830948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.830997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.831039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.831081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.831401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.832476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.832543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.832602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.832655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.833051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.833211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.833259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.833301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.664 [2024-07-15 10:40:33.833347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.833825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.834817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.834870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.834913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.834963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.835443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.835594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.835641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.835683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.835724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.836021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.664 [2024-07-15 10:40:33.837235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.837922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.838312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.839352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.839403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.839445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.839489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.664 [2024-07-15 10:40:33.839982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.840140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.840187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.840232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.840274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.840534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.664 [2024-07-15 10:40:33.841555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.841605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.841647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.841688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.841956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.842115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.842160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.665 [2024-07-15 10:40:33.842201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.842243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.842735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.843884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.843960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.844001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.844042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.844301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.844456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.844506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.844547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.844589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:56.665 [2024-07-15 10:40:33.845040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:56.665 [2024-07-15 10:40:33.845973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:57.194 [2024-07-15 10:40:34.193757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:32:57.194 [2024-07-15 10:40:34.195308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.194 [2024-07-15 10:40:34.195385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.194 [2024-07-15 10:40:34.196776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.194 [2024-07-15 10:40:34.196833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.198896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.194 [2024-07-15 10:40:34.200128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.200971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.202220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.202291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.202335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.202377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.194 [2024-07-15 10:40:34.202717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.202868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.202912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.202965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.203007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.204325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.204377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.204419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.204460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.204805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.204966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.205013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.205054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.194 [2024-07-15 10:40:34.205095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.206321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.206376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.206418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.206467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.206730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.206886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.206971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.207017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.194 [2024-07-15 10:40:34.207060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.208355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.208407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.208459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.195 [2024-07-15 10:40:34.208501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.208768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.208921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.208982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.209024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.209067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.210261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.210318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.210362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.210404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.210852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.211026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.211075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.195 [2024-07-15 10:40:34.211117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.211158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.212990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.213032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.214229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.214921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.195 [2024-07-15 10:40:34.214976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.215947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.216245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.216406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.217781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.217831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.218862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.220068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.220466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.220521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.220907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.221275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.221441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.195 [2024-07-15 10:40:34.222932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.222988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.223375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.224702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.225113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.225160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.226092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.226359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.226516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.227292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.227340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.228723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.229923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.195 [2024-07-15 10:40:34.230589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.230637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.231634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.231950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.195 [2024-07-15 10:40:34.232108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.233494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.233544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.234610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.235786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.237239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.237290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.237695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.237968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.196 [2024-07-15 10:40:34.238123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.238516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.238566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.240145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.241458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.243087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.243145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.244221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.244534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.244689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.246122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.246170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.246572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.196 [2024-07-15 10:40:34.247728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.248933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.248984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.250364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.250864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.251025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.252143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.252189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.253296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.254504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.255836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.255885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.256725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.196 [2024-07-15 10:40:34.257001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.257154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.258058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.258108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.259544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.260874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.262130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.262181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.263617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.264003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.264173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.265729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.265785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.196 [2024-07-15 10:40:34.267180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.268406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.269089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.269137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.270094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.270360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.270509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.271208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.271260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.272621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.274963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.275362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.196 [2024-07-15 10:40:34.275408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.197 [2024-07-15 10:40:34.276670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:57.461 [... previous "Failed to get src_mbufs!" error repeated continuously from 10:40:34.276670 through 10:40:34.566113 ...]
00:32:57.461 [2024-07-15 10:40:34.566178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.567667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.567712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.567991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.568683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.568736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.570351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.570405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.575164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.575222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.576029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.576077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.576348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.461 [2024-07-15 10:40:34.576975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.577031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.578102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.578146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.583959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.584017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.584800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.584845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.585174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.586232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.586289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.587127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.587175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.461 [2024-07-15 10:40:34.589750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.589808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.591182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.591228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.591684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.593405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.593462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.593890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.593940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.596187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.596249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.597717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.597760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.461 [2024-07-15 10:40:34.598133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.598630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.598691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.599160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.599210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.601710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.601786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.603345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.603389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.603800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.605132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.461 [2024-07-15 10:40:34.605186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.605571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.462 [2024-07-15 10:40:34.605626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.607984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.608048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.608440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.608494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.608870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.610495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.610550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.611279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.611326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.613727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.613786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.615082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.462 [2024-07-15 10:40:34.615126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.615442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.615945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.616006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.616559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.616606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.619210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.619283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.620772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.620816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.621192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.622413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.622468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.462 [2024-07-15 10:40:34.622854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.622910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.625340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.625400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.625794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.625848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.626324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.627865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.627918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.628711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.628757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.631166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.631224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.462 [2024-07-15 10:40:34.632563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.632606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.632944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.633442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.633498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.633975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.634023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.636508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.636577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.638159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.638204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.638590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.639979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.462 [2024-07-15 10:40:34.640034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.640425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.640473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.642829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.642893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.643296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.643351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.643737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.645348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.645402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.646113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.646159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.649269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.462 [2024-07-15 10:40:34.649328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.650571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.650615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.650917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.651932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.651989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.652787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.462 [2024-07-15 10:40:34.652835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.657761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.658345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.659812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.659861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.660348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.725 [2024-07-15 10:40:34.661712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.662874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.663631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.663687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.669334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.669390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.669431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.669477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.669801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.671117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.671171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.671212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.671253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.725 [2024-07-15 10:40:34.674721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.674780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.674823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.674864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.675243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.675398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.675444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.675486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.675527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.725 [2024-07-15 10:40:34.679596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.679887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.725 [2024-07-15 10:40:34.683916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.725 [2024-07-15 10:40:34.683964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:57.991 [... same *ERROR* line repeated continuously through 2024-07-15 10:40:34.932742; repeats elided ...]
00:32:57.991 [2024-07-15 10:40:34.934027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.935529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.935888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.937339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.938045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.939164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.939869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.944624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.946217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.947171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.948048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.948325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.949187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.991 [2024-07-15 10:40:34.949970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.951146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.952416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.956920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.957545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.958461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.959629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.959904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.960998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.962170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.963177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.964838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.970169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.991 [2024-07-15 10:40:34.971328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.972502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.973399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.973677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.974183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.975613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.976008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.977504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.982348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.982819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.984173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.984629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.984902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.991 [2024-07-15 10:40:34.986425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.986964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.988214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.989785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.994800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.996474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.997229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.998286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.998561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:34.999645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.000451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.001651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.002278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.991 [2024-07-15 10:40:35.006589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.007741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.008403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.008451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.008726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.009231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.010715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.011765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.011816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.015970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.016033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.017232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.017276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.991 [2024-07-15 10:40:35.017569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.018752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.991 [2024-07-15 10:40:35.018810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.020155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.020206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.024503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.024562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.024965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.025010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.025297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.026539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.026595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.027073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.992 [2024-07-15 10:40:35.027122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.032322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.032379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.033257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.033304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.033671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.035075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.035133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.036471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.036519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.039675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.039733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.041392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.992 [2024-07-15 10:40:35.041447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.041822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.042448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.042505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.043763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.043814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.046978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.047036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.047976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.048026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.048299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.049587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.049642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.992 [2024-07-15 10:40:35.050586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.050636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.057103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.057167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.058505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.058554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.058923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.060362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.060418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.061991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.062041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.066322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.066386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.992 [2024-07-15 10:40:35.066992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.067040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.067355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.068595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.068652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.069133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.069200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.072509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.072568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.074112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.074157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.074466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.076103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.992 [2024-07-15 10:40:35.076165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.077133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.077180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.081441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.081501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.082315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.082364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.082674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.083863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.083918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.085529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.085576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.090424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.992 [2024-07-15 10:40:35.090483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.090872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.090918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.091201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.091700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.091755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.092159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.092216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.097716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.097775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.098181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.098251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.098525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.992 [2024-07-15 10:40:35.099034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.099091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.100301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.100345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.105174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.105239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.105632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.105682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.106180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.106679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.992 [2024-07-15 10:40:35.106750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.993 [2024-07-15 10:40:35.107152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:57.993 [2024-07-15 10:40:35.107202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:57.993 [2024-07-15 10:40:35.109687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.256 [2024-07-15 10:40:35.277456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.277874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.278031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.279515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.279572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.281066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.282375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.283636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.283685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.284950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.285226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.285379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.286759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.286806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.287209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.288639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.290009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.290062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.291501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.291821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.291980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.293239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.293287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.294568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.295939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.296338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.296385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.297729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.298007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.298162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.299657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.299710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.301138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.302345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.303630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.303678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.304560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.304982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.305134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.305912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.305966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.307214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.308388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.309682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.309730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.311034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.311357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.311508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.312796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.312843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.313285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.314496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.315771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.315818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.317099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.317376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.317530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.319142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.319197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.320836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.322099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.322496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.322541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.323565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.323918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.324075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.325356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.325404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.326689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.327962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.329348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.329396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.330820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.331175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.331329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.331722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.331767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.333132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.334353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.335339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.335390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.336719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.336995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.337147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.338646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.338699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.340128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.341506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.342457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.342508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.343275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.343549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.343706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.344921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.344974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.345375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.346785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.348272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.348331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.349079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.349373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.349523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.350938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.351004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.256 [2024-07-15 10:40:35.351389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.352712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.352767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.352809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.353571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.353850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.354011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.354064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.354119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.355753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.357076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.357955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.256 [2024-07-15 10:40:35.358833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.257 [2024-07-15 10:40:35.359766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.360045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.360208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.361281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.362214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.363425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.367493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.368706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.369640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.370917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.371194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.371690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.372087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.257 [2024-07-15 10:40:35.373355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.374304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.378292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.379935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.380327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.381591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.381993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.383178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.384478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.385422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.386384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.389034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.390059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.257 [2024-07-15 10:40:35.391389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.392330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.392646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.393147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.393544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.395132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.396253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.398543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.398946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.399340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.401010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.401332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.402148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.257 [2024-07-15 10:40:35.403742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.405000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.405545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.407964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.408552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.410107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.411367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.411682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.412185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.412672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.414006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.415477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.257 [2024-07-15 10:40:35.417340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.257 [2024-07-15 10:40:35.417736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.418266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.419548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.419824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.420493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.421725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.423347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.423743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.426529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.427222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.428349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.429941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.430329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.430827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.431838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.432634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.433572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.435787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.436551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.438087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.438134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.438410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.439146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.440379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.440780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.440827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.443436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.443494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.444116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.444169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.444443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.446159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.446224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.447703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.447747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.450743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.257 [2024-07-15 10:40:35.450802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.452044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.452092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.452451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.454058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.454113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.455291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.455339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.457363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.457423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.517 [2024-07-15 10:40:35.458717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.458764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.459133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.459629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.459697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.460093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.460149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.461911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.461978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.462368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.462416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.462809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.463315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.463383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.463771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.463832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.465571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.465629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.466025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.466074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.466434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.466939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.466993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.467391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.467433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.469896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.469963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.470366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.470414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.470765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.471271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.471333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.471724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.471780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.473429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.473494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.473882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.473940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.474258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.474749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.474810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.475216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.475267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.477051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.477115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.477512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.477561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.477950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.478446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.478508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.478906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.478970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.481577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.481648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.482039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.482094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.482414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.483161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.483224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.484531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.484579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.487415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.487478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.489105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.489161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.489497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.490550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.490605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.491828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.491878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.494290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.494349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.495119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.495169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.495443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.496804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.496859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.497262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.497311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.500115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.500180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.501174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.501220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.501619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.502904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.502965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.503363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.503407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.505487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.505547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.506990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.507048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.507320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.507813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.518 [2024-07-15 10:40:35.507867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.508257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.508304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.509995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.510052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.511207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.511254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.511530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.512035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.512107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.512494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.512548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.515254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.515319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.516772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.516823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.517206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.517699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.517754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.518645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.518694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.521073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.521131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.522137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.522195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.522639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.523141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.523198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.524741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.524786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.527435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.527500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.527890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.527940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.528360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.529364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.529420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.530587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.530635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.532889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.532951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.533336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.533378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.533723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.535378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.535452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.536998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.537042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.538581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.538639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.539031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.539073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.539385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.540683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.540738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.541433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.541487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.543095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.543151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.543538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.543585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.543858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.545523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.545578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.546403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.546452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.548053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.548110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.549344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.549391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.549696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.550511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.550571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.552136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.552182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.553895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.553958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.555126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.555173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.555444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.556376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.556432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.557591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.557640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.560933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.562121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.563626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.563675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.564026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.565343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.566992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.568581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.568628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.571062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.571122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.571164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.571206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.571526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.572411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.519 [2024-07-15 10:40:35.572466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.572509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.572551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.574787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.576894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.578634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.578698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.578742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.578796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.579160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.579318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.579364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.579408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.579464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.580856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.580907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.580964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.581022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:32:58.520 [2024-07-15 10:40:35.581424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.581576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.581621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.581667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.581712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.583804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.520 [2024-07-15 10:40:35.583852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.585750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.586934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.586986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.587033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.520 [2024-07-15 10:40:35.587077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.587488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.587634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.587680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.587738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.587782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.589075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.589133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.589181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.590683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.590964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.591117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.591163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.520 [2024-07-15 10:40:35.591204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.592764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.593976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.594587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.594633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.595024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.595316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.595469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.596736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.596782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.598074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.599205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.600561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.520 [2024-07-15 10:40:35.600609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.602007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.602281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.602435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.602829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.602875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.603270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.604456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.605971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.606018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.606745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.520 [2024-07-15 10:40:35.607025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.607184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.608722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.608782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.610269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.611485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.612694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.612742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.614012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.614342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.614494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.615991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.616040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.616755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.617915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.619373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.619420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.619804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.620281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.620433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.622076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.622131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.623758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.625021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.626292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.626339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.627619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.627895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.628050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.629099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.629160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.629544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.630828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.632299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.632355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.633914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.634270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.634421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.635690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.635738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.637011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.638334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.638732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.638780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.640108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.640384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.640538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.642048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.642097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.643301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.644498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.646013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.646062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.646706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.647112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.647265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.647995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.648044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.649309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.650477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.651754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.651802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.653135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.653453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.653604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.655125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.655175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.655567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.656785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.658082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.658131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.659645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.659970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.660122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.661676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.661724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.663222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.664487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.664886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.664939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.666170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.666497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.666652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.667940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.667988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.669480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.670736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.672316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.672383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.673838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.674247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.674401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.674795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.521 [2024-07-15 10:40:35.674850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.676504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.677693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.521 [2024-07-15 10:40:35.678314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.678362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.679621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.679895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.680052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.681560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.681608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.682599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.684202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.522 [2024-07-15 10:40:35.685405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.522 [2024-07-15 10:40:35.685456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [previous message repeated ~270 times between 10:40:35.685 and 10:40:35.866]
00:32:58.786 [2024-07-15 10:40:35.867502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.867557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.868184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.868234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.869888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.869953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.870815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.870860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.871228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.872646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.872702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.874021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.874068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.786 [2024-07-15 10:40:35.875758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.875816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.877362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.877415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.877735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.878466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.878522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.879812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.879861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.881765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.881824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.882844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.882892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.786 [2024-07-15 10:40:35.883174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.884315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.884371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.885320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.885369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.888184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.888243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.889372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.889420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.889774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.891393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.891461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.892984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.786 [2024-07-15 10:40:35.893038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.895671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.895728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.896904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.786 [2024-07-15 10:40:35.896956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.897230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.898521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.898577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.899838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.899883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.901727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.901790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.903215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.903271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.903543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.904229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.904285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.905293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.905342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.906904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.906971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.908464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.908510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.908783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.909526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.909582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.911071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.911117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.913841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.913900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.914297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.914346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.914628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.915134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.915205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.915596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.915644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.917710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.917768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.918557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.918606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.919029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.919581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.919636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.920803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.920847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.922389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.923830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.925338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.925386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.925785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.927204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.928454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.928870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.928917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.931457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.931515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.931557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.931605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.931881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.933060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.933114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.933156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.933197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.934427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.934477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.934518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.934559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.934986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.935138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.935185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.935233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.935280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.936589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.936646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.936690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.936736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.937013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.937168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.937213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.937254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.937295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.938515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.938571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.938620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.938676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.938955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.939108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.939158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.939199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.939240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.940635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.940686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.940733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.940779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.941057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.941209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.941254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.941296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.941341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.942537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.942587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.942629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.787 [2024-07-15 10:40:35.942677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.787 [2024-07-15 10:40:35.942955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.943107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.943152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.943193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.943235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.944406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.944458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.944526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.944569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.944961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.945115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.945160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.945202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.788 [2024-07-15 10:40:35.945243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.946376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.946427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.946469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.946510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.946824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.946985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.947032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.947073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.947115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.948313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.948365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:58.788 [2024-07-15 10:40:35.948407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:58.788 [2024-07-15 10:40:35.949699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [identical "Failed to get src_mbufs!" error lines from accel_dpdk_cryptodev.c:468 repeated continuously from 10:40:35.949699 through 10:40:36.141064; duplicate lines omitted]
00:32:59.052 [2024-07-15 10:40:36.141474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.141867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.142264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.143954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.144356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.144746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.145147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.145650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.146154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.146550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.146949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.147335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.149061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.052 [2024-07-15 10:40:36.149464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.149860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.149908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.150248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.150745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.152211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.153819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.153875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.155539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.155599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.155994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.156042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.156353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.052 [2024-07-15 10:40:36.157625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.157679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.158586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.158631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.052 [2024-07-15 10:40:36.160299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.160358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.160744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.160793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.161074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.162583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.162648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.163128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.163177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.164789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.164851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.165676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.165725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.166066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.167255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.167311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.168592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.168641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.170521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.170600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.172082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.172137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.172490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.174092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.174155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.174547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.174595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.177142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.177200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.177778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.177825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.178147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.179842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.179903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.180296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.180347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.182893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.182960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.184367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.184413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.184764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.185654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.185715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.186110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.186158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.187870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.187936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.189045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.189095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.189366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.189859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.189917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.190314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.190364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.192596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.192655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.193606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.193655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.193992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.194487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.194547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.194939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.194986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.197767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.197835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.199326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.199382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.199776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.200295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.200351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.201422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.201467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.203650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.203708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.204622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.204686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.205139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.205636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.205704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.207187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.207247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.210172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.210239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.210628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.210676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.211059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.212265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.212322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.213263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.213313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.215480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.215546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.215945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.215996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.216355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.217719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.217778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.219314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.219370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.053 [2024-07-15 10:40:36.220969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.053 [2024-07-15 10:40:36.221030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.221417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.221465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.221740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.222781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.222837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.224339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.224387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.227201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.227268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.227659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.227709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.228062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.054 [2024-07-15 10:40:36.228842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.228897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.230166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.230213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.232490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.232562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.232966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.233022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.233460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.235078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.235153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.235540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.235588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.054 [2024-07-15 10:40:36.237454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.237513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.238808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.238859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.239139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.240693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.240754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.241835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.241885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.243551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.243628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.244298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.054 [2024-07-15 10:40:36.244355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.054 [2024-07-15 10:40:36.244674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.317 [identical "Failed to get src_mbufs!" error repeated for subsequent allocations through 10:40:36.395776; duplicate log entries omitted] 
00:32:59.317 [2024-07-15 10:40:36.395829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.396274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.397610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.398024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.398077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.398849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.399177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.399333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.400685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.400739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.401950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.403334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.403742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.317 [2024-07-15 10:40:36.403791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.405455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.405776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.405940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.406429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.406479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.407739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.317 [2024-07-15 10:40:36.409162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.409905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.409967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.410842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.411123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.411280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.412425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.412473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.413316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.414691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.415550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.415602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.416329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.416707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.416863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.417524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.417576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.417969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.419300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.420918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.420985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.421863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.422192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.422345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.423623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.423672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.424557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.426231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.427737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.427785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.428360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.428672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.428823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.430313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.430364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.430757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.432017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.433294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.433342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.434822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.435125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.435283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.436855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.436912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.438550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.439867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.439923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.439975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.440364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.440677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.440828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.440878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.440919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.441885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.443322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.443756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.445143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.446656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.446949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.447104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.448510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.449515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.450780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.452376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.452774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.453794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.455017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.455319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.456386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.457596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.458587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.458991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.460882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.461294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.461687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.462088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.462444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.462950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.463349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.463744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.464335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.466095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.466494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.466889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.467292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.467690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.468204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.468602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.469008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.469402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.471247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.471652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.472058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.472455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.472839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.473345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.473740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.474150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.474544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.477550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.478263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.479325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.480086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.480559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.481066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.482577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.483904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.484411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.486137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.486739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.318 [2024-07-15 10:40:36.487968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.489585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.489978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.491272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.492895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.493298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.493688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.318 [2024-07-15 10:40:36.496229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.319 [2024-07-15 10:40:36.496628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.319 [2024-07-15 10:40:36.497027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.319 [2024-07-15 10:40:36.497420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.319 [2024-07-15 10:40:36.497698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:32:59.319 [2024-07-15 10:40:36.498838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:32:59.887 
00:32:59.887 Latency(us)
00:32:59.887 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:59.887 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x0 length 0x100
00:32:59.887 crypto_ram : 5.77 44.37 2.77 0.00 0.00 2799060.37 66561.78 2567643.49
00:32:59.887 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x100 length 0x100
00:32:59.887 crypto_ram : 5.74 44.57 2.79 0.00 0.00 2778956.13 72944.42 2494699.07
00:32:59.887 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x0 length 0x100
00:32:59.887 crypto_ram2 : 5.77 44.36 2.77 0.00 0.00 2704052.76 65649.98 2582232.38
00:32:59.887 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x100 length 0x100
00:32:59.887 crypto_ram2 : 5.75 44.56 2.78 0.00 0.00 2684456.07 72488.51 2480110.19
00:32:59.887 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x0 length 0x100
00:32:59.887 crypto_ram3 : 5.58 288.00 18.00 0.00 0.00 399006.19 51061.09 561672.01
00:32:59.887 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x100 length 0x100
00:32:59.887 crypto_ram3 : 5.56 298.40 18.65 0.00 0.00 385359.43 8833.11 558024.79
00:32:59.887 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x0 length 0x100
00:32:59.887 crypto_ram4 : 5.64 301.68 18.86 0.00 0.00 370694.36 13392.14 492374.82
00:32:59.887 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:59.887 Verification LBA range: start 0x100 length 0x100
00:32:59.887 crypto_ram4 : 5.65 315.31 19.71 0.00 0.00 355188.28 23251.03 485080.38
00:32:59.887 ===================================================================================================================
00:32:59.887 Total : 1381.26 86.33 0.00 0.00 688586.52 8833.11 2582232.38
00:33:00.146 
00:33:00.146 real 0m8.963s
00:33:00.146 user 0m17.001s
00:33:00.146 sys 0m0.428s
00:33:00.146 10:40:37 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:00.146 10:40:37 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:00.146 ************************************
00:33:00.146 END TEST bdev_verify_big_io
00:33:00.146 ************************************
00:33:00.147 10:40:37 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:33:00.147 10:40:37 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:00.147 10:40:37 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:33:00.147 10:40:37 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:00.147 10:40:37 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:33:00.406 ************************************
00:33:00.406 START TEST bdev_write_zeroes
00:33:00.406 ************************************
00:33:00.406 10:40:37 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:00.406 [2024-07-15 10:40:37.407359] Starting SPDK v24.09-pre git sha1
719d03c6a / DPDK 24.03.0 initialization...
00:33:00.406 [2024-07-15 10:40:37.407420] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid662741 ]
00:33:00.406 [2024-07-15 10:40:37.533761] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:00.665 [2024-07-15 10:40:37.632554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:00.665 [2024-07-15 10:40:37.653811] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:33:00.665 [2024-07-15 10:40:37.661838] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:33:00.665 [2024-07-15 10:40:37.669858] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:33:00.665 [2024-07-15 10:40:37.768686] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:33:03.201 [2024-07-15 10:40:40.004028] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:33:03.201 [2024-07-15 10:40:40.004098] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:33:03.201 [2024-07-15 10:40:40.004114] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:03.201 [2024-07-15 10:40:40.012049] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:33:03.201 [2024-07-15 10:40:40.012069] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:03.201 [2024-07-15 10:40:40.012082] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:03.201 [2024-07-15 10:40:40.020071] vbdev_crypto_rpc.c:
115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:33:03.201 [2024-07-15 10:40:40.020090] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:03.201 [2024-07-15 10:40:40.020102] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:03.201 [2024-07-15 10:40:40.028089] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:33:03.201 [2024-07-15 10:40:40.028106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:03.201 [2024-07-15 10:40:40.028117] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:03.201 Running I/O for 1 seconds...
00:33:04.140 
00:33:04.140 Latency(us)
00:33:04.140 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:04.140 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:04.140 crypto_ram : 1.03 1965.41 7.68 0.00 0.00 64618.36 5442.34 77503.44
00:33:04.140 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:04.140 crypto_ram2 : 1.03 1978.68 7.73 0.00 0.00 63903.91 5385.35 72032.61
00:33:04.140 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:04.140 crypto_ram3 : 1.02 15115.62 59.05 0.00 0.00 8339.77 2464.72 10770.70
00:33:04.140 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:04.140 crypto_ram4 : 1.02 15152.76 59.19 0.00 0.00 8293.72 2464.72 8662.15
00:33:04.140 ===================================================================================================================
00:33:04.140 Total : 34212.48 133.64 0.00 0.00 14794.08 2464.72 77503.44
00:33:04.399 
00:33:04.399 real 0m4.192s
00:33:04.399 user 0m3.750s
00:33:04.399 sys 0m0.401s
00:33:04.399 10:40:41 blockdev_crypto_aesni.bdev_write_zeroes --
common/autotest_common.sh@1124 -- # xtrace_disable 00:33:04.399 10:40:41 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:04.399 ************************************ 00:33:04.399 END TEST bdev_write_zeroes 00:33:04.399 ************************************ 00:33:04.399 10:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:33:04.399 10:40:41 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:04.399 10:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:04.399 10:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:04.399 10:40:41 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:04.683 ************************************ 00:33:04.683 START TEST bdev_json_nonenclosed 00:33:04.683 ************************************ 00:33:04.683 10:40:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:04.683 [2024-07-15 10:40:41.664477] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:33:04.683 [2024-07-15 10:40:41.664536] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663289 ] 00:33:04.683 [2024-07-15 10:40:41.789966] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:04.942 [2024-07-15 10:40:41.892673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:04.942 [2024-07-15 10:40:41.892734] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:04.942 [2024-07-15 10:40:41.892755] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:04.942 [2024-07-15 10:40:41.892768] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:04.942 00:33:04.942 real 0m0.393s 00:33:04.942 user 0m0.237s 00:33:04.943 sys 0m0.153s 00:33:04.943 10:40:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:04.943 10:40:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:04.943 10:40:41 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:04.943 ************************************ 00:33:04.943 END TEST bdev_json_nonenclosed 00:33:04.943 ************************************ 00:33:04.943 10:40:42 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:04.943 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:33:04.943 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:04.943 10:40:42 blockdev_crypto_aesni -- 
common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:04.943 10:40:42 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:04.943 10:40:42 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:04.943 ************************************ 00:33:04.943 START TEST bdev_json_nonarray 00:33:04.943 ************************************ 00:33:04.943 10:40:42 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:04.943 [2024-07-15 10:40:42.140267] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:33:04.943 [2024-07-15 10:40:42.140335] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663422 ] 00:33:05.202 [2024-07-15 10:40:42.269205] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:05.202 [2024-07-15 10:40:42.365526] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:05.202 [2024-07-15 10:40:42.365602] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:33:05.202 [2024-07-15 10:40:42.365623] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:05.202 [2024-07-15 10:40:42.365636] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:05.462 00:33:05.462 real 0m0.390s 00:33:05.462 user 0m0.241s 00:33:05.462 sys 0m0.146s 00:33:05.462 10:40:42 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:05.462 10:40:42 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:05.462 10:40:42 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:05.462 ************************************ 00:33:05.462 END TEST bdev_json_nonarray 00:33:05.462 ************************************ 00:33:05.462 10:40:42 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:33:05.462 10:40:42 blockdev_crypto_aesni -- 
bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:33:05.462 10:40:42 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:33:05.462 00:33:05.462 real 1m12.268s 00:33:05.462 user 2m39.649s 00:33:05.462 sys 0m9.161s 00:33:05.462 10:40:42 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:05.462 10:40:42 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:33:05.462 ************************************ 00:33:05.462 END TEST blockdev_crypto_aesni 00:33:05.462 ************************************ 00:33:05.462 10:40:42 -- common/autotest_common.sh@1142 -- # return 0 00:33:05.462 10:40:42 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:05.462 10:40:42 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:05.462 10:40:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:05.462 10:40:42 -- common/autotest_common.sh@10 -- # set +x 00:33:05.462 ************************************ 00:33:05.462 START TEST blockdev_crypto_sw 00:33:05.462 ************************************ 00:33:05.462 10:40:42 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:33:05.722 * Looking for test storage... 
00:33:05.722 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:33:05.722 
10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=663541 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 663541 00:33:05.722 10:40:42 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 663541 ']' 00:33:05.722 10:40:42 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:05.722 10:40:42 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:05.722 10:40:42 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:05.722 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:05.722 10:40:42 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:05.722 10:40:42 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:33:05.722 10:40:42 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:05.722 [2024-07-15 10:40:42.806773] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:33:05.722 [2024-07-15 10:40:42.806846] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663541 ] 00:33:05.982 [2024-07-15 10:40:42.935479] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:05.982 [2024-07-15 10:40:43.038709] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:06.548 10:40:43 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:06.548 10:40:43 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:33:06.548 10:40:43 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:33:06.548 10:40:43 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:33:06.548 10:40:43 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:33:06.548 10:40:43 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.548 10:40:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.856 Malloc0 00:33:06.856 Malloc1 00:33:06.856 true 00:33:06.856 true 00:33:06.856 true 00:33:06.856 [2024-07-15 10:40:43.982768] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:06.856 crypto_ram 00:33:06.856 [2024-07-15 10:40:43.990796] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:06.856 crypto_ram2 00:33:06.856 [2024-07-15 10:40:43.998821] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:06.856 crypto_ram3 00:33:06.856 [ 00:33:06.856 { 00:33:06.856 "name": "Malloc1", 00:33:06.856 "aliases": [ 00:33:06.856 "7822bde2-eb53-41bb-831f-a95b9536ab30" 00:33:06.856 ], 00:33:06.856 "product_name": "Malloc disk", 00:33:06.856 "block_size": 4096, 00:33:06.856 "num_blocks": 4096, 00:33:06.856 "uuid": "7822bde2-eb53-41bb-831f-a95b9536ab30", 
00:33:06.856 "assigned_rate_limits": { 00:33:06.856 "rw_ios_per_sec": 0, 00:33:06.856 "rw_mbytes_per_sec": 0, 00:33:06.856 "r_mbytes_per_sec": 0, 00:33:06.856 "w_mbytes_per_sec": 0 00:33:06.856 }, 00:33:06.856 "claimed": true, 00:33:06.856 "claim_type": "exclusive_write", 00:33:06.856 "zoned": false, 00:33:06.856 "supported_io_types": { 00:33:06.856 "read": true, 00:33:06.856 "write": true, 00:33:06.856 "unmap": true, 00:33:06.856 "flush": true, 00:33:06.856 "reset": true, 00:33:06.856 "nvme_admin": false, 00:33:06.856 "nvme_io": false, 00:33:06.856 "nvme_io_md": false, 00:33:06.856 "write_zeroes": true, 00:33:06.856 "zcopy": true, 00:33:06.856 "get_zone_info": false, 00:33:06.856 "zone_management": false, 00:33:06.856 "zone_append": false, 00:33:06.856 "compare": false, 00:33:06.856 "compare_and_write": false, 00:33:06.856 "abort": true, 00:33:06.856 "seek_hole": false, 00:33:06.856 "seek_data": false, 00:33:06.856 "copy": true, 00:33:06.856 "nvme_iov_md": false 00:33:06.856 }, 00:33:06.856 "memory_domains": [ 00:33:06.856 { 00:33:06.856 "dma_device_id": "system", 00:33:06.856 "dma_device_type": 1 00:33:06.856 }, 00:33:06.856 { 00:33:06.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:06.856 "dma_device_type": 2 00:33:06.856 } 00:33:06.856 ], 00:33:06.856 "driver_specific": {} 00:33:06.856 } 00:33:06.856 ] 00:33:06.856 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.856 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:33:06.856 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.856 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:06.857 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:06.857 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:33:06.857 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:33:06.857 10:40:44 
blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:06.857 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:07.131 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:07.131 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:07.131 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:33:07.131 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:33:07.131 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:07.131 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:07.131 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:33:07.132 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "fa1055fc-cd08-565a-a574-8b1dc49c387d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fa1055fc-cd08-565a-a574-8b1dc49c387d",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a9331ac4-2116-57b2-b8cf-37b6b1819cd8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "a9331ac4-2116-57b2-b8cf-37b6b1819cd8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:07.132 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:33:07.132 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:33:07.132 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:33:07.132 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:33:07.132 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 663541 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 663541 ']' 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 663541 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 663541 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 663541' 00:33:07.132 killing process with pid 663541 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 663541 00:33:07.132 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 663541 00:33:07.700 10:40:44 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:07.700 10:40:44 
blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:07.700 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:33:07.700 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:07.700 10:40:44 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:07.700 ************************************ 00:33:07.700 START TEST bdev_hello_world 00:33:07.700 ************************************ 00:33:07.700 10:40:44 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:33:07.700 [2024-07-15 10:40:44.662954] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:33:07.700 [2024-07-15 10:40:44.662999] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663756 ] 00:33:07.700 [2024-07-15 10:40:44.777015] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:07.700 [2024-07-15 10:40:44.875214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:07.960 [2024-07-15 10:40:45.058516] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:07.960 [2024-07-15 10:40:45.058592] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:07.960 [2024-07-15 10:40:45.058617] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.960 [2024-07-15 10:40:45.066535] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:07.960 [2024-07-15 10:40:45.066554] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:07.960 [2024-07-15 10:40:45.066565] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.960 [2024-07-15 10:40:45.074555] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:07.960 [2024-07-15 10:40:45.074572] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:07.960 [2024-07-15 10:40:45.074583] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:07.960 [2024-07-15 10:40:45.116300] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:33:07.960 [2024-07-15 10:40:45.116336] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:33:07.960 [2024-07-15 10:40:45.116354] hello_bdev.c: 
244:hello_start: *NOTICE*: Opening io channel 00:33:07.960 [2024-07-15 10:40:45.118428] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:33:07.960 [2024-07-15 10:40:45.118496] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:33:07.960 [2024-07-15 10:40:45.118511] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:33:07.960 [2024-07-15 10:40:45.118545] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:33:07.960 00:33:07.960 [2024-07-15 10:40:45.118563] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:33:08.220 00:33:08.220 real 0m0.719s 00:33:08.220 user 0m0.484s 00:33:08.220 sys 0m0.217s 00:33:08.220 10:40:45 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:08.220 10:40:45 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:33:08.220 ************************************ 00:33:08.220 END TEST bdev_hello_world 00:33:08.220 ************************************ 00:33:08.220 10:40:45 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:08.220 10:40:45 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:33:08.220 10:40:45 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:08.220 10:40:45 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:08.220 10:40:45 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:08.479 ************************************ 00:33:08.479 START TEST bdev_bounds 00:33:08.479 ************************************ 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=663929 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM 
EXIT 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 663929' 00:33:08.479 Process bdevio pid: 663929 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 663929 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 663929 ']' 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:08.479 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:08.479 10:40:45 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:33:08.479 [2024-07-15 10:40:45.475984] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:33:08.479 [2024-07-15 10:40:45.476052] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid663929 ] 00:33:08.479 [2024-07-15 10:40:45.605566] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:33:08.738 [2024-07-15 10:40:45.714462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:08.738 [2024-07-15 10:40:45.714548] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:08.738 [2024-07-15 10:40:45.714552] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:08.738 [2024-07-15 10:40:45.894299] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:08.738 [2024-07-15 10:40:45.894369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:08.738 [2024-07-15 10:40:45.894384] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.738 [2024-07-15 10:40:45.902319] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:08.738 [2024-07-15 10:40:45.902338] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:08.738 [2024-07-15 10:40:45.902349] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:08.738 [2024-07-15 10:40:45.910341] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:08.738 [2024-07-15 10:40:45.910359] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:08.738 [2024-07-15 10:40:45.910370] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:09.305 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- 
# (( i == 0 )) 00:33:09.305 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:33:09.305 10:40:46 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:09.564 I/O targets: 00:33:09.564 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:33:09.564 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:33:09.564 00:33:09.564 00:33:09.564 CUnit - A unit testing framework for C - Version 2.1-3 00:33:09.564 http://cunit.sourceforge.net/ 00:33:09.564 00:33:09.564 00:33:09.564 Suite: bdevio tests on: crypto_ram3 00:33:09.564 Test: blockdev write read block ...passed 00:33:09.564 Test: blockdev write zeroes read block ...passed 00:33:09.564 Test: blockdev write zeroes read no split ...passed 00:33:09.564 Test: blockdev write zeroes read split ...passed 00:33:09.564 Test: blockdev write zeroes read split partial ...passed 00:33:09.564 Test: blockdev reset ...passed 00:33:09.564 Test: blockdev write read 8 blocks ...passed 00:33:09.564 Test: blockdev write read size > 128k ...passed 00:33:09.564 Test: blockdev write read invalid size ...passed 00:33:09.564 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:09.564 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:09.564 Test: blockdev write read max offset ...passed 00:33:09.564 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:09.564 Test: blockdev writev readv 8 blocks ...passed 00:33:09.564 Test: blockdev writev readv 30 x 1block ...passed 00:33:09.564 Test: blockdev writev readv block ...passed 00:33:09.564 Test: blockdev writev readv size > 128k ...passed 00:33:09.564 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:09.564 Test: blockdev comparev and writev ...passed 00:33:09.564 Test: blockdev nvme passthru rw ...passed 00:33:09.564 Test: blockdev nvme passthru vendor specific ...passed 
00:33:09.564 Test: blockdev nvme admin passthru ...passed
00:33:09.564 Test: blockdev copy ...passed
00:33:09.564 Suite: bdevio tests on: crypto_ram
00:33:09.564 Test: blockdev write read block ...passed
00:33:09.564 Test: blockdev write zeroes read block ...passed
00:33:09.564 Test: blockdev write zeroes read no split ...passed
00:33:09.564 Test: blockdev write zeroes read split ...passed
00:33:09.564 Test: blockdev write zeroes read split partial ...passed
00:33:09.564 Test: blockdev reset ...passed
00:33:09.564 Test: blockdev write read 8 blocks ...passed
00:33:09.564 Test: blockdev write read size > 128k ...passed
00:33:09.564 Test: blockdev write read invalid size ...passed
00:33:09.564 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:33:09.564 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:33:09.564 Test: blockdev write read max offset ...passed
00:33:09.564 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:33:09.564 Test: blockdev writev readv 8 blocks ...passed
00:33:09.564 Test: blockdev writev readv 30 x 1block ...passed
00:33:09.564 Test: blockdev writev readv block ...passed
00:33:09.564 Test: blockdev writev readv size > 128k ...passed
00:33:09.564 Test: blockdev writev readv size > 128k in two iovs ...passed
00:33:09.564 Test: blockdev comparev and writev ...passed
00:33:09.564 Test: blockdev nvme passthru rw ...passed
00:33:09.564 Test: blockdev nvme passthru vendor specific ...passed
00:33:09.564 Test: blockdev nvme admin passthru ...passed
00:33:09.564 Test: blockdev copy ...passed
00:33:09.564
00:33:09.564 Run Summary: Type Total Ran Passed Failed Inactive
00:33:09.564 suites 2 2 n/a 0 0
00:33:09.564 tests 46 46 46 0 0
00:33:09.564 asserts 260 260 260 0 n/a
00:33:09.564
00:33:09.564 Elapsed time = 0.084 seconds
00:33:09.564 0
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 663929
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 663929 ']'
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 663929
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 663929
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:33:09.564 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 663929'
killing process with pid 663929
10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 663929
10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 663929
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT
00:33:09.823
00:33:09.823 real 0m1.429s
00:33:09.823 user 0m3.703s
00:33:09.823 sys 0m0.398s
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:33:09.823 ************************************
00:33:09.823 END TEST bdev_bounds
00:33:09.823 ************************************
00:33:09.823 10:40:46 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:33:09.823 10:40:46 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' ''
00:33:09.823 10:40:46 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:33:09.823 10:40:46 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:33:09.823 10:40:46 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:33:09.823 ************************************
00:33:09.823 START TEST bdev_nbd
00:33:09.823 ************************************
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' ''
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]]
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3')
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]]
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2
00:33:09.823 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=664139
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 664139 /var/tmp/spdk-nbd.sock
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 664139 ']'
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
10:40:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable
00:33:09.824 10:40:46 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:33:09.824 [2024-07-15 10:40:46.996613] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization...
00:33:09.824 [2024-07-15 10:40:46.996675] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:33:10.083 [2024-07-15 10:40:47.124784] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:10.083 [2024-07-15 10:40:47.228096] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:10.343 [2024-07-15 10:40:47.405137] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:33:10.343 [2024-07-15 10:40:47.405212] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:33:10.343 [2024-07-15 10:40:47.405228] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:10.343 [2024-07-15 10:40:47.413155] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:33:10.343 [2024-07-15 10:40:47.413175] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:33:10.343 [2024-07-15 10:40:47.413186] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:10.343 [2024-07-15 10:40:47.421177] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:33:10.343 [2024-07-15 10:40:47.421196] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:33:10.343 [2024-07-15 10:40:47.421207] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3'
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3'
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 ))
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:33:10.938 10:40:47 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:33:11.196 1+0 records in
00:33:11.196 1+0 records out
00:33:11.196 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256917 s, 15.9 MB/s
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:33:11.196 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3
00:33:11.455 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1
00:33:11.455 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1
00:33:11.455 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1
00:33:11.455 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:33:11.455 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:33:11.455 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:33:11.455 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:33:11.456 1+0 records in
00:33:11.456 1+0 records out
00:33:11.456 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000327319 s, 12.5 MB/s
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:33:11.456 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:33:11.713 {
00:33:11.713 "nbd_device": "/dev/nbd0",
00:33:11.713 "bdev_name": "crypto_ram"
00:33:11.713 },
00:33:11.713 {
00:33:11.713 "nbd_device": "/dev/nbd1",
00:33:11.713 "bdev_name": "crypto_ram3"
00:33:11.713 }
00:33:11.713 ]'
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:33:11.713 {
00:33:11.713 "nbd_device": "/dev/nbd0",
00:33:11.713 "bdev_name": "crypto_ram"
00:33:11.713 },
00:33:11.713 {
00:33:11.713 "nbd_device": "/dev/nbd1",
00:33:11.713 "bdev_name": "crypto_ram3"
00:33:11.713 }
00:33:11.713 ]'
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:33:11.713 10:40:48 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:33:11.971 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:33:11.972 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:12.230 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']'
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1'
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1'
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:33:12.489 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
00:33:12.747 /dev/nbd0
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:33:12.747 1+0 records in
00:33:12.747 1+0 records out
00:33:12.747 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000215255 s, 19.0 MB/s
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:33:12.747 10:40:49 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1
00:33:13.005 /dev/nbd1
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:33:13.005 1+0 records in
00:33:13.005 1+0 records out
00:33:13.005 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00035323 s, 11.6 MB/s
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:13.005 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:33:13.263 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:33:13.263 {
00:33:13.263 "nbd_device": "/dev/nbd0",
00:33:13.263 "bdev_name": "crypto_ram"
00:33:13.263 },
00:33:13.263 {
00:33:13.263 "nbd_device": "/dev/nbd1",
00:33:13.263 "bdev_name": "crypto_ram3"
00:33:13.263 }
00:33:13.263 ]'
00:33:13.263 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[
00:33:13.263 {
00:33:13.263 "nbd_device": "/dev/nbd0",
00:33:13.263 "bdev_name": "crypto_ram"
00:33:13.263 },
00:33:13.263 {
00:33:13.263 "nbd_device": "/dev/nbd1",
00:33:13.263 "bdev_name": "crypto_ram3"
00:33:13.263 }
00:33:13.263 ]'
00:33:13.263 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:33:13.521 /dev/nbd1'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:33:13.521 /dev/nbd1'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256
00:33:13.521 256+0 records in
00:33:13.521 256+0 records out
00:33:13.521 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0104696 s, 100 MB/s
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:33:13.521 256+0 records in
00:33:13.521 256+0 records out
00:33:13.521 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200455 s, 52.3 MB/s
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:33:13.521 256+0 records in
00:33:13.521 256+0 records out
00:33:13.521 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0362269 s, 28.9 MB/s
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:33:13.521 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:33:13.779 10:40:50 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:14.038 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:33:14.296 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:33:14.297 10:40:51
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret
00:33:14.297 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512
00:33:14.557 malloc_lvol_verify
00:33:14.557 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs
00:33:14.816 ceafeccd-5da7-49c6-bfb8-5622848c750b
00:33:14.816 10:40:51 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs
00:33:15.075 7738a869-b5c1-4471-99aa-59395eda3807
00:33:15.075 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0
00:33:15.334 /dev/nbd0
00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0
00:33:15.334 mke2fs 1.46.5 (30-Dec-2021)
00:33:15.334 Discarding device blocks: 0/4096 done
00:33:15.334 Creating filesystem with 4096 1k blocks and 1024 inodes
00:33:15.334
00:33:15.334 Allocating group tables: 0/1 done
00:33:15.334 Writing inode tables: 0/1 done
00:33:15.334 Creating journal (1024 blocks): done
00:33:15.334 Writing superblocks and filesystem accounting information: 0/1 done
00:33:15.334
00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0
00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks
/var/tmp/spdk-nbd.sock /dev/nbd0 00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:33:15.334 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 664139 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 664139 ']' 00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 
664139
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 664139
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 664139' killing process with pid 664139
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 664139
00:33:15.594 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 664139
00:33:15.853 10:40:52 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT
00:33:15.853
00:33:15.853 real 0m6.037s
00:33:15.853 user 0m8.729s
00:33:15.853 sys 0m2.373s
00:33:15.853 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:15.853 10:40:52 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:33:15.853 ************************************
00:33:15.853 END TEST bdev_nbd ************************************
00:33:15.853 10:40:53 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:33:15.853 10:40:53 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]]
00:33:15.853 10:40:53 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']'
00:33:15.853 10:40:53 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']'
00:33:15.853 10:40:53 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite ''
00:33:15.853 10:40:53 blockdev_crypto_sw -- common/autotest_common.sh@1099 --
# '[' 3 -le 1 ']' 00:33:15.853 10:40:53 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.853 10:40:53 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:16.124 ************************************ 00:33:16.124 START TEST bdev_fio 00:33:16.124 ************************************ 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:16.124 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local 
fio_dir=/usr/src/fio 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:16.124 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio 
-- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:16.125 ************************************ 00:33:16.125 START TEST bdev_fio_rw_verify 00:33:16.125 ************************************ 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:16.125 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=
00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:33:16.126 10:40:53 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:33:16.393 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:16.393 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:33:16.393 fio-3.35
00:33:16.393 Starting 2 threads
00:33:28.601
00:33:28.601 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=665247: Mon Jul 15 10:41:04 2024
00:33:28.601 read: IOPS=20.4k, BW=79.5MiB/s (83.4MB/s)(795MiB/10000msec)
00:33:28.601 slat (usec): min=14, max=203, avg=22.04, stdev= 7.49
00:33:28.601 clat (usec): min=7, max=703, avg=156.80, stdev=71.20
00:33:28.601 lat (usec): min=26, max=758, avg=178.84, stdev=74.94
00:33:28.601 clat percentiles (usec):
00:33:28.601 | 50.000th=[ 149], 99.000th=[ 392], 99.900th=[ 457], 99.990th=[ 545],
00:33:28.601 | 99.999th=[ 635]
00:33:28.601 write: IOPS=24.5k, BW=95.7MiB/s (100MB/s)(908MiB/9482msec); 0 zone resets
00:33:28.601 slat (usec): min=14, max=493, avg=36.48, stdev= 8.98
00:33:28.601 clat (usec): min=24, max=908, avg=210.03, stdev=105.79
00:33:28.601 lat (usec): min=51, max=1006, avg=246.51, stdev=110.10
00:33:28.601 clat percentiles (usec):
00:33:28.601 | 50.000th=[ 202], 99.000th=[ 537], 99.900th=[ 660], 99.990th=[ 742],
00:33:28.601 | 99.999th=[ 865]
00:33:28.601 bw ( KiB/s): min=69344, max=104560, per=94.38%, avg=92516.42, stdev=5774.71, samples=38
00:33:28.601 iops : min=17336, max=26140, avg=23129.11, stdev=1443.68, samples=38
00:33:28.601 lat (usec) : 10=0.01%, 20=0.01%, 50=3.83%, 100=14.16%, 250=60.28%
00:33:28.601 lat (usec) : 500=20.86%, 750=0.86%, 1000=0.01%
00:33:28.601 cpu : usr=99.58%, sys=0.01%, ctx=46, majf=0, minf=528
00:33:28.601 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0%
00:33:28.601 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:33:28.601 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:33:28.601 issued rwts: total=203634,232366,0,0 short=0,0,0,0 dropped=0,0,0,0
00:33:28.601 latency : target=0, window=0, percentile=100.00%, depth=8
00:33:28.601
00:33:28.601 Run status group 0 (all jobs):
00:33:28.601 READ: bw=79.5MiB/s (83.4MB/s), 79.5MiB/s-79.5MiB/s (83.4MB/s-83.4MB/s), io=795MiB (834MB), run=10000-10000msec
00:33:28.601 WRITE: bw=95.7MiB/s (100MB/s), 95.7MiB/s-95.7MiB/s (100MB/s-100MB/s), io=908MiB (952MB), run=9482-9482msec
00:33:28.601
00:33:28.601 real 0m11.191s
00:33:28.601 user 0m23.824s
00:33:28.601 sys 0m0.357s
00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x
00:33:28.601 ************************************
00:33:28.601 END TEST
bdev_fio_rw_verify 00:33:28.601 ************************************ 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:28.601 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "fa1055fc-cd08-565a-a574-8b1dc49c387d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fa1055fc-cd08-565a-a574-8b1dc49c387d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a9331ac4-2116-57b2-b8cf-37b6b1819cd8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "a9331ac4-2116-57b2-b8cf-37b6b1819cd8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' 
"w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:28.602 crypto_ram3 ]] 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "fa1055fc-cd08-565a-a574-8b1dc49c387d"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "fa1055fc-cd08-565a-a574-8b1dc49c387d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": 
false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "a9331ac4-2116-57b2-b8cf-37b6b1819cd8"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "a9331ac4-2116-57b2-b8cf-37b6b1819cd8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 
00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:28.602 ************************************ 00:33:28.602 START TEST bdev_fio_trim 00:33:28.602 ************************************ 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:28.602 10:41:04 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:28.602 10:41:04 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:28.602 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.602 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:28.602 fio-3.35 00:33:28.602 Starting 2 threads 00:33:38.581 00:33:38.581 job_crypto_ram: (groupid=0, 
jobs=2): err= 0: pid=666758: Mon Jul 15 10:41:15 2024 00:33:38.581 write: IOPS=39.6k, BW=155MiB/s (162MB/s)(1549MiB/10001msec); 0 zone resets 00:33:38.581 slat (nsec): min=14107, max=97026, avg=22182.97, stdev=4261.65 00:33:38.581 clat (usec): min=23, max=1977, avg=165.83, stdev=91.32 00:33:38.581 lat (usec): min=41, max=2003, avg=188.01, stdev=94.60 00:33:38.581 clat percentiles (usec): 00:33:38.581 | 50.000th=[ 133], 99.000th=[ 343], 99.900th=[ 363], 99.990th=[ 490], 00:33:38.581 | 99.999th=[ 1926] 00:33:38.581 bw ( KiB/s): min=155256, max=161280, per=100.00%, avg=158610.11, stdev=663.04, samples=38 00:33:38.581 iops : min=38814, max=40320, avg=39652.53, stdev=165.76, samples=38 00:33:38.581 trim: IOPS=39.6k, BW=155MiB/s (162MB/s)(1549MiB/10001msec); 0 zone resets 00:33:38.581 slat (nsec): min=5963, max=61791, avg=9961.49, stdev=2210.79 00:33:38.581 clat (usec): min=42, max=1805, avg=110.73, stdev=33.48 00:33:38.581 lat (usec): min=51, max=1816, avg=120.69, stdev=33.59 00:33:38.581 clat percentiles (usec): 00:33:38.581 | 50.000th=[ 112], 99.000th=[ 182], 99.900th=[ 194], 99.990th=[ 310], 00:33:38.581 | 99.999th=[ 652] 00:33:38.581 bw ( KiB/s): min=155288, max=161280, per=100.00%, avg=158611.79, stdev=661.47, samples=38 00:33:38.581 iops : min=38822, max=40320, avg=39652.95, stdev=165.37, samples=38 00:33:38.581 lat (usec) : 50=3.83%, 100=33.12%, 250=49.76%, 500=13.28%, 750=0.01% 00:33:38.581 lat (usec) : 1000=0.01% 00:33:38.581 lat (msec) : 2=0.01% 00:33:38.581 cpu : usr=99.61%, sys=0.00%, ctx=22, majf=0, minf=345 00:33:38.581 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:38.581 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:38.581 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:38.581 issued rwts: total=0,396436,396436,0 short=0,0,0,0 dropped=0,0,0,0 00:33:38.581 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:38.581 00:33:38.581 Run status group 0 
(all jobs): 00:33:38.581 WRITE: bw=155MiB/s (162MB/s), 155MiB/s-155MiB/s (162MB/s-162MB/s), io=1549MiB (1624MB), run=10001-10001msec 00:33:38.581 TRIM: bw=155MiB/s (162MB/s), 155MiB/s-155MiB/s (162MB/s-162MB/s), io=1549MiB (1624MB), run=10001-10001msec 00:33:38.581 00:33:38.581 real 0m11.112s 00:33:38.581 user 0m23.368s 00:33:38.581 sys 0m0.365s 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:38.581 ************************************ 00:33:38.581 END TEST bdev_fio_trim 00:33:38.581 ************************************ 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:38.581 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:38.581 00:33:38.581 real 0m22.656s 00:33:38.581 user 0m47.369s 00:33:38.581 sys 0m0.921s 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:38.581 10:41:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:38.581 ************************************ 00:33:38.581 END TEST bdev_fio 00:33:38.581 ************************************ 00:33:38.581 10:41:15 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:38.581 10:41:15 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:38.581 10:41:15 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test 
bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:38.581 10:41:15 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:38.581 10:41:15 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:38.581 10:41:15 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:38.839 ************************************ 00:33:38.839 START TEST bdev_verify 00:33:38.840 ************************************ 00:33:38.840 10:41:15 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:38.840 [2024-07-15 10:41:15.842161] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:33:38.840 [2024-07-15 10:41:15.842220] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid668176 ] 00:33:38.840 [2024-07-15 10:41:15.970163] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:39.098 [2024-07-15 10:41:16.068868] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:39.098 [2024-07-15 10:41:16.068874] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:39.098 [2024-07-15 10:41:16.235145] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:39.098 [2024-07-15 10:41:16.235215] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:39.098 [2024-07-15 10:41:16.235230] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:39.098 [2024-07-15 10:41:16.243163] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:39.098 [2024-07-15 10:41:16.243181] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:39.098 [2024-07-15 10:41:16.243192] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:39.098 [2024-07-15 10:41:16.251183] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:39.098 [2024-07-15 10:41:16.251200] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:39.098 [2024-07-15 10:41:16.251212] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:39.357 Running I/O for 5 seconds... 
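The fio job sections traced earlier in this log (`bdev/blockdev.sh@356-358`) are generated by looping over the bdevs that report `supported_io_types.unmap == true` and emitting a `[job_NAME]` / `filename=NAME` pair for each. A minimal sketch of that emission step, with the bdev list hard-coded to the two names seen above (an assumption; the real script derives them from `bdev_get_bdevs` JSON via `jq`):

```shell
#!/bin/sh
# Emit one fio job section per trim-capable bdev, as blockdev.sh@356-358 does.
# "crypto_ram crypto_ram3" is taken from the log above, not queried live.
bdevs='crypto_ram crypto_ram3'
for b in $bdevs; do
  echo "[job_${b}]"
  echo "filename=${b}"
done
```

Appending these sections to a fio file that already carries the global options (`ioengine=spdk_bdev`, `bs=4k`, etc.) yields the two-job trim workload whose results follow.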
00:33:44.681 00:33:44.681 Latency(us) 00:33:44.681 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:44.681 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:44.681 Verification LBA range: start 0x0 length 0x800 00:33:44.681 crypto_ram : 5.01 5907.21 23.08 0.00 0.00 21577.72 1702.51 29633.67 00:33:44.681 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:44.681 Verification LBA range: start 0x800 length 0x800 00:33:44.681 crypto_ram : 5.02 5920.92 23.13 0.00 0.00 21532.01 1617.03 29633.67 00:33:44.681 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:44.681 Verification LBA range: start 0x0 length 0x800 00:33:44.681 crypto_ram3 : 5.03 2978.06 11.63 0.00 0.00 42732.25 2179.78 31229.33 00:33:44.681 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:44.681 Verification LBA range: start 0x800 length 0x800 00:33:44.681 crypto_ram3 : 5.03 2975.86 11.62 0.00 0.00 42758.51 1923.34 31229.33 00:33:44.681 =================================================================================================================== 00:33:44.681 Total : 17782.04 69.46 0.00 0.00 28669.01 1617.03 31229.33 00:33:44.681 00:33:44.681 real 0m5.801s 00:33:44.681 user 0m10.929s 00:33:44.681 sys 0m0.221s 00:33:44.681 10:41:21 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:44.681 10:41:21 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:44.681 ************************************ 00:33:44.681 END TEST bdev_verify 00:33:44.681 ************************************ 00:33:44.681 10:41:21 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:44.681 10:41:21 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:44.681 10:41:21 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:44.681 10:41:21 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:44.681 10:41:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:44.681 ************************************ 00:33:44.681 START TEST bdev_verify_big_io 00:33:44.681 ************************************ 00:33:44.681 10:41:21 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:44.681 [2024-07-15 10:41:21.689347] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:33:44.681 [2024-07-15 10:41:21.689389] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid668895 ] 00:33:44.681 [2024-07-15 10:41:21.801882] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:44.940 [2024-07-15 10:41:21.903870] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:44.940 [2024-07-15 10:41:21.903876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:44.940 [2024-07-15 10:41:22.065365] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:44.940 [2024-07-15 10:41:22.065428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:44.941 [2024-07-15 10:41:22.065442] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.941 [2024-07-15 10:41:22.073385] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:44.941 [2024-07-15 10:41:22.073404] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:44.941 [2024-07-15 10:41:22.073416] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.941 [2024-07-15 10:41:22.081408] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:44.941 [2024-07-15 10:41:22.081425] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:44.941 [2024-07-15 10:41:22.081436] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:44.941 Running I/O for 5 seconds... 00:33:51.508 00:33:51.508 Latency(us) 00:33:51.508 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:51.508 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:51.508 Verification LBA range: start 0x0 length 0x80 00:33:51.508 crypto_ram : 5.29 411.02 25.69 0.00 0.00 303435.29 8605.16 399370.69 00:33:51.508 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:51.508 Verification LBA range: start 0x80 length 0x80 00:33:51.508 crypto_ram : 5.29 411.02 25.69 0.00 0.00 303423.48 8548.17 399370.69 00:33:51.508 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:51.508 Verification LBA range: start 0x0 length 0x80 00:33:51.508 crypto_ram3 : 5.31 216.83 13.55 0.00 0.00 551941.76 7009.50 413959.57 00:33:51.508 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:51.508 Verification LBA range: start 0x80 length 0x80 00:33:51.508 crypto_ram3 : 5.31 216.83 13.55 0.00 0.00 551925.13 6981.01 408488.74 00:33:51.508 =================================================================================================================== 00:33:51.508 Total : 
1255.69 78.48 0.00 0.00 389450.02 6981.01 413959.57 00:33:51.508 00:33:51.508 real 0m6.061s 00:33:51.508 user 0m11.485s 00:33:51.508 sys 0m0.217s 00:33:51.508 10:41:27 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:51.508 10:41:27 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:33:51.508 ************************************ 00:33:51.508 END TEST bdev_verify_big_io 00:33:51.508 ************************************ 00:33:51.508 10:41:27 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:51.508 10:41:27 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:51.508 10:41:27 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:51.508 10:41:27 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:51.508 10:41:27 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:51.508 ************************************ 00:33:51.508 START TEST bdev_write_zeroes 00:33:51.508 ************************************ 00:33:51.508 10:41:27 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:51.508 [2024-07-15 10:41:27.860585] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:33:51.508 [2024-07-15 10:41:27.860648] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid669643 ] 00:33:51.508 [2024-07-15 10:41:27.987907] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:51.508 [2024-07-15 10:41:28.085002] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:51.508 [2024-07-15 10:41:28.254577] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:51.508 [2024-07-15 10:41:28.254642] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:51.508 [2024-07-15 10:41:28.254657] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.508 [2024-07-15 10:41:28.262594] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:33:51.508 [2024-07-15 10:41:28.262613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:51.508 [2024-07-15 10:41:28.262625] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.508 [2024-07-15 10:41:28.270616] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:33:51.508 [2024-07-15 10:41:28.270634] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:33:51.508 [2024-07-15 10:41:28.270645] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:51.508 Running I/O for 1 seconds... 
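The `fio_bdev` wrapper traced in the trim test above (`autotest_common.sh@1344-1352`) probes the fio plugin with `ldd`, greps for a sanitizer runtime, and, if one is linked in, puts it ahead of the plugin in `LD_PRELOAD`. A sketch of that probe with simulated `ldd` output (the library paths and plugin path are placeholders, so this runs without an SPDK tree):

```shell
#!/bin/sh
# Simulate the sanitizer-detection pipeline: ldd | grep libasan | awk '{print $3}'.
# ldd_out mimics the "name => path (addr)" format; field 3 is the resolved path.
plugin=/path/to/spdk_bdev   # placeholder for build/fio/spdk_bdev
ldd_out='	libasan.so.8 => /usr/lib64/libasan.so.8 (0x00007f0000000000)
	libc.so.6 => /lib64/libc.so.6 (0x00007f0000200000)'
asan_lib=$(printf '%s\n' "$ldd_out" | grep libasan | awk '{print $3}')
echo "LD_PRELOAD=${asan_lib} ${plugin}"
```

In the log above both `grep libasan` and `grep libclang_rt.asan` came back empty (`asan_lib=`), so fio was launched with only the plugin in `LD_PRELOAD`.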
00:33:52.446 00:33:52.446 Latency(us) 00:33:52.446 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:52.446 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:52.446 crypto_ram : 1.01 26651.49 104.11 0.00 0.00 4789.78 1289.35 6439.62 00:33:52.446 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:52.446 crypto_ram3 : 1.01 13298.73 51.95 0.00 0.00 9556.90 5926.73 9801.91 00:33:52.446 =================================================================================================================== 00:33:52.446 Total : 39950.22 156.06 0.00 0.00 6378.82 1289.35 9801.91 00:33:52.446 00:33:52.446 real 0m1.764s 00:33:52.446 user 0m1.513s 00:33:52.446 sys 0m0.232s 00:33:52.446 10:41:29 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:52.446 10:41:29 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:52.446 ************************************ 00:33:52.446 END TEST bdev_write_zeroes 00:33:52.446 ************************************ 00:33:52.446 10:41:29 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:33:52.446 10:41:29 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:52.446 10:41:29 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:52.446 10:41:29 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:52.446 10:41:29 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:52.706 ************************************ 00:33:52.706 START TEST bdev_json_nonenclosed 00:33:52.706 ************************************ 00:33:52.706 10:41:29 blockdev_crypto_sw.bdev_json_nonenclosed -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:52.706 [2024-07-15 10:41:29.697570] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:33:52.706 [2024-07-15 10:41:29.697631] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid669976 ] 00:33:52.706 [2024-07-15 10:41:29.824471] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:52.965 [2024-07-15 10:41:29.932867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:52.965 [2024-07-15 10:41:29.932935] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:52.965 [2024-07-15 10:41:29.932957] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:52.965 [2024-07-15 10:41:29.932968] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:52.965 00:33:52.965 real 0m0.393s 00:33:52.965 user 0m0.238s 00:33:52.965 sys 0m0.153s 00:33:52.965 10:41:30 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:52.965 10:41:30 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:52.965 10:41:30 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:52.965 ************************************ 00:33:52.965 END TEST bdev_json_nonenclosed 00:33:52.965 ************************************ 00:33:52.965 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:52.966 10:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:33:52.966 10:41:30 
blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:52.966 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:52.966 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:52.966 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:52.966 ************************************ 00:33:52.966 START TEST bdev_json_nonarray 00:33:52.966 ************************************ 00:33:52.966 10:41:30 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:53.225 [2024-07-15 10:41:30.175094] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:33:53.225 [2024-07-15 10:41:30.175156] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid670004 ] 00:33:53.225 [2024-07-15 10:41:30.302963] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:53.225 [2024-07-15 10:41:30.411159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:53.225 [2024-07-15 10:41:30.411233] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:33:53.225 [2024-07-15 10:41:30.411254] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:53.225 [2024-07-15 10:41:30.411266] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:53.484 00:33:53.484 real 0m0.397s 00:33:53.484 user 0m0.245s 00:33:53.484 sys 0m0.150s 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:53.484 ************************************ 00:33:53.484 END TEST bdev_json_nonarray 00:33:53.484 ************************************ 00:33:53.484 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:33:53.484 10:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:33:53.484 10:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:33:53.484 10:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:33:53.484 10:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:33:53.484 10:41:30 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:33:53.484 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:33:53.484 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:53.484 10:41:30 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:33:53.484 ************************************ 00:33:53.484 START TEST bdev_crypto_enomem 00:33:53.484 ************************************ 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local 
base_dev=base0 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=670046 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 670046 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 670046 ']' 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:53.484 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:53.484 10:41:30 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:53.484 [2024-07-15 10:41:30.657467] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:33:53.484 [2024-07-15 10:41:30.657541] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid670046 ] 00:33:53.744 [2024-07-15 10:41:30.777904] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:53.744 [2024-07-15 10:41:30.890465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:54.680 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:54.680 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:33:54.680 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:33:54.680 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:54.681 true 00:33:54.681 base0 00:33:54.681 true 00:33:54.681 [2024-07-15 10:41:31.620241] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:33:54.681 crypt0 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:54.681 [ 00:33:54.681 { 00:33:54.681 "name": "crypt0", 00:33:54.681 "aliases": [ 00:33:54.681 "7285fd14-caba-5f2a-89d2-547672ba7f8b" 00:33:54.681 ], 00:33:54.681 "product_name": "crypto", 00:33:54.681 "block_size": 512, 00:33:54.681 "num_blocks": 2097152, 00:33:54.681 "uuid": "7285fd14-caba-5f2a-89d2-547672ba7f8b", 00:33:54.681 "assigned_rate_limits": { 00:33:54.681 "rw_ios_per_sec": 0, 00:33:54.681 "rw_mbytes_per_sec": 0, 00:33:54.681 "r_mbytes_per_sec": 0, 00:33:54.681 "w_mbytes_per_sec": 0 00:33:54.681 }, 00:33:54.681 "claimed": false, 00:33:54.681 "zoned": false, 00:33:54.681 "supported_io_types": { 00:33:54.681 "read": true, 00:33:54.681 "write": true, 00:33:54.681 "unmap": false, 00:33:54.681 "flush": false, 00:33:54.681 "reset": true, 00:33:54.681 "nvme_admin": false, 00:33:54.681 "nvme_io": false, 00:33:54.681 "nvme_io_md": false, 00:33:54.681 "write_zeroes": true, 00:33:54.681 "zcopy": false, 00:33:54.681 "get_zone_info": false, 00:33:54.681 "zone_management": false, 00:33:54.681 "zone_append": false, 00:33:54.681 "compare": false, 00:33:54.681 "compare_and_write": false, 00:33:54.681 "abort": false, 
00:33:54.681 "seek_hole": false, 00:33:54.681 "seek_data": false, 00:33:54.681 "copy": false, 00:33:54.681 "nvme_iov_md": false 00:33:54.681 }, 00:33:54.681 "memory_domains": [ 00:33:54.681 { 00:33:54.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:33:54.681 "dma_device_type": 2 00:33:54.681 } 00:33:54.681 ], 00:33:54.681 "driver_specific": { 00:33:54.681 "crypto": { 00:33:54.681 "base_bdev_name": "EE_base0", 00:33:54.681 "name": "crypt0", 00:33:54.681 "key_name": "test_dek_sw" 00:33:54.681 } 00:33:54.681 } 00:33:54.681 } 00:33:54.681 ] 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=670202 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:33:54.681 10:41:31 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:54.681 Running I/O for 5 seconds... 
00:33:55.617 10:41:32 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:33:55.617 10:41:32 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:55.617 10:41:32 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:55.617 10:41:32 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:55.617 10:41:32 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 670202 00:33:59.806 00:33:59.806 Latency(us) 00:33:59.806 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:59.806 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:33:59.806 crypt0 : 5.00 36202.22 141.41 0.00 0.00 880.24 425.63 1175.37 00:33:59.806 =================================================================================================================== 00:33:59.806 Total : 36202.22 141.41 0.00 0.00 880.24 425.63 1175.37 00:33:59.806 0 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 670046 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 670046 ']' 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 670046 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:33:59.806 10:41:36 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 670046 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 670046' 00:33:59.806 killing process with pid 670046 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 670046 00:33:59.806 Received shutdown signal, test time was about 5.000000 seconds 00:33:59.806 00:33:59.806 Latency(us) 00:33:59.806 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:59.806 =================================================================================================================== 00:33:59.806 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:59.806 10:41:36 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 670046 00:34:00.065 10:41:37 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:34:00.065 00:34:00.065 real 0m6.471s 00:34:00.065 user 0m6.723s 00:34:00.065 sys 0m0.384s 00:34:00.065 10:41:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:00.065 10:41:37 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:34:00.065 ************************************ 00:34:00.065 END TEST bdev_crypto_enomem 00:34:00.065 ************************************ 00:34:00.065 10:41:37 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT 
SIGTERM EXIT 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:34:00.065 10:41:37 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:34:00.065 00:34:00.065 real 0m54.510s 00:34:00.065 user 1m33.826s 00:34:00.065 sys 0m6.425s 00:34:00.065 10:41:37 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:00.065 10:41:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:34:00.065 ************************************ 00:34:00.065 END TEST blockdev_crypto_sw 00:34:00.065 ************************************ 00:34:00.065 10:41:37 -- common/autotest_common.sh@1142 -- # return 0 00:34:00.065 10:41:37 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:00.065 10:41:37 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:00.065 10:41:37 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:00.065 10:41:37 -- common/autotest_common.sh@10 -- # set +x 00:34:00.065 ************************************ 00:34:00.065 START TEST blockdev_crypto_qat 00:34:00.065 ************************************ 00:34:00.065 10:41:37 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:34:00.325 * Looking for test storage... 
00:34:00.325 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=670977 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 670977 00:34:00.325 10:41:37 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:00.325 10:41:37 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 670977 ']' 00:34:00.325 10:41:37 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:00.325 10:41:37 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:00.325 10:41:37 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:00.325 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:00.325 10:41:37 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:00.325 10:41:37 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:00.325 [2024-07-15 10:41:37.402548] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:34:00.325 [2024-07-15 10:41:37.402625] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid670977 ] 00:34:00.584 [2024-07-15 10:41:37.531904] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:00.585 [2024-07-15 10:41:37.637424] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:01.152 10:41:38 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:01.152 10:41:38 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:34:01.152 10:41:38 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:34:01.152 10:41:38 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:34:01.152 10:41:38 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:34:01.152 10:41:38 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:01.152 10:41:38 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:01.152 [2024-07-15 10:41:38.335653] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:01.152 [2024-07-15 10:41:38.343687] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:01.412 [2024-07-15 10:41:38.351709] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:01.412 [2024-07-15 10:41:38.422634] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:03.945 true 00:34:03.945 true 00:34:03.945 true 00:34:03.945 true 00:34:03.945 Malloc0 00:34:03.945 Malloc1 00:34:03.945 Malloc2 00:34:03.945 Malloc3 00:34:03.945 [2024-07-15 10:41:40.785131] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:34:03.945 crypto_ram 00:34:03.945 [2024-07-15 10:41:40.793149] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:03.945 crypto_ram1 00:34:03.945 [2024-07-15 10:41:40.801173] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:03.945 crypto_ram2 00:34:03.945 [2024-07-15 10:41:40.809195] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:03.945 crypto_ram3 00:34:03.945 [ 00:34:03.945 { 00:34:03.945 "name": "Malloc1", 00:34:03.945 "aliases": [ 00:34:03.945 "5915f7a0-0824-4e60-a92b-e622549f6532" 00:34:03.945 ], 00:34:03.945 "product_name": "Malloc disk", 00:34:03.945 "block_size": 512, 00:34:03.945 "num_blocks": 65536, 00:34:03.945 "uuid": "5915f7a0-0824-4e60-a92b-e622549f6532", 00:34:03.945 "assigned_rate_limits": { 00:34:03.945 "rw_ios_per_sec": 0, 00:34:03.945 "rw_mbytes_per_sec": 0, 00:34:03.945 "r_mbytes_per_sec": 0, 00:34:03.945 "w_mbytes_per_sec": 0 00:34:03.945 }, 00:34:03.945 "claimed": true, 00:34:03.945 "claim_type": "exclusive_write", 00:34:03.945 "zoned": false, 00:34:03.945 "supported_io_types": { 00:34:03.945 "read": true, 00:34:03.945 "write": true, 00:34:03.945 "unmap": true, 00:34:03.945 "flush": true, 00:34:03.945 "reset": true, 00:34:03.945 "nvme_admin": false, 00:34:03.945 "nvme_io": false, 00:34:03.945 "nvme_io_md": false, 00:34:03.945 "write_zeroes": true, 00:34:03.945 "zcopy": true, 00:34:03.945 "get_zone_info": false, 00:34:03.945 "zone_management": false, 00:34:03.945 "zone_append": false, 00:34:03.945 "compare": false, 00:34:03.945 "compare_and_write": false, 00:34:03.945 "abort": true, 00:34:03.945 "seek_hole": false, 00:34:03.945 "seek_data": false, 00:34:03.945 "copy": true, 00:34:03.945 "nvme_iov_md": false 00:34:03.945 }, 00:34:03.945 "memory_domains": [ 00:34:03.945 { 00:34:03.945 "dma_device_id": "system", 00:34:03.945 "dma_device_type": 1 00:34:03.945 }, 00:34:03.945 { 00:34:03.945 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:34:03.945 "dma_device_type": 2 00:34:03.945 } 00:34:03.945 ], 00:34:03.945 "driver_specific": {} 00:34:03.945 } 00:34:03.945 ] 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.945 10:41:40 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.945 10:41:40 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:34:03.945 10:41:40 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.945 10:41:40 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.945 10:41:40 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.945 10:41:40 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:34:03.945 10:41:40 blockdev_crypto_qat -- 
bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:34:03.945 10:41:40 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.945 10:41:40 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.945 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:34:03.945 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:34:03.945 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "18dc96c5-f49b-5ab1-96d1-b74793753fdd"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "18dc96c5-f49b-5ab1-96d1-b74793753fdd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' 
"aliases": [' ' "9fc54cd8-9ee7-5a4b-b557-52b732dce01f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9fc54cd8-9ee7-5a4b-b557-52b732dce01f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "55d2336a-82b5-5842-821c-ea5678d01b82"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "55d2336a-82b5-5842-821c-ea5678d01b82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b6c7c11e-5b6d-5164-b8bc-20dcd2f061df"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b6c7c11e-5b6d-5164-b8bc-20dcd2f061df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:03.945 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:34:03.945 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:34:03.945 
10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:34:03.945 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 670977 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 670977 ']' 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 670977 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 670977 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 670977' 00:34:03.945 killing process with pid 670977 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 670977 00:34:03.945 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 670977 00:34:04.512 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:04.512 10:41:41 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:04.512 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:04.512 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:04.512 10:41:41 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:04.512 ************************************ 00:34:04.512 START TEST bdev_hello_world 00:34:04.512 
************************************ 00:34:04.512 10:41:41 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:04.771 [2024-07-15 10:41:41.728488] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:34:04.771 [2024-07-15 10:41:41.728549] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid671522 ] 00:34:04.771 [2024-07-15 10:41:41.856295] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:04.771 [2024-07-15 10:41:41.961499] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:05.029 [2024-07-15 10:41:41.982797] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:05.029 [2024-07-15 10:41:41.990826] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:05.029 [2024-07-15 10:41:41.998852] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:05.029 [2024-07-15 10:41:42.111010] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:07.591 [2024-07-15 10:41:44.330270] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:07.591 [2024-07-15 10:41:44.330339] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:07.591 [2024-07-15 10:41:44.330354] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.591 [2024-07-15 10:41:44.338290] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "test_dek_qat_xts" 00:34:07.591 [2024-07-15 10:41:44.338312] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:07.591 [2024-07-15 10:41:44.338325] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.591 [2024-07-15 10:41:44.346308] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:07.591 [2024-07-15 10:41:44.346326] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:07.591 [2024-07-15 10:41:44.346338] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.591 [2024-07-15 10:41:44.354329] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:07.591 [2024-07-15 10:41:44.354346] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:07.591 [2024-07-15 10:41:44.354357] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:07.591 [2024-07-15 10:41:44.431767] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:07.591 [2024-07-15 10:41:44.431815] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:07.591 [2024-07-15 10:41:44.431834] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:07.591 [2024-07-15 10:41:44.433124] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:07.591 [2024-07-15 10:41:44.433194] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:07.591 [2024-07-15 10:41:44.433210] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:07.591 [2024-07-15 10:41:44.433253] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:07.591 00:34:07.591 [2024-07-15 10:41:44.433272] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:07.849 00:34:07.849 real 0m3.150s 00:34:07.849 user 0m2.712s 00:34:07.849 sys 0m0.401s 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:07.849 ************************************ 00:34:07.849 END TEST bdev_hello_world 00:34:07.849 ************************************ 00:34:07.849 10:41:44 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:07.849 10:41:44 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:34:07.849 10:41:44 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:07.849 10:41:44 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:07.849 10:41:44 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:07.849 ************************************ 00:34:07.849 START TEST bdev_bounds 00:34:07.849 ************************************ 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=672054 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 672054' 00:34:07.849 Process bdevio pid: 672054 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 672054 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 672054 ']' 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:07.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:07.849 10:41:44 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:07.849 [2024-07-15 10:41:44.931051] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:34:07.849 [2024-07-15 10:41:44.931114] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid672054 ] 00:34:08.108 [2024-07-15 10:41:45.060205] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:08.108 [2024-07-15 10:41:45.166187] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:08.108 [2024-07-15 10:41:45.166272] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:08.108 [2024-07-15 10:41:45.166278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:08.108 [2024-07-15 10:41:45.187700] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:08.108 [2024-07-15 10:41:45.195740] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:08.108 [2024-07-15 10:41:45.203742] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:08.108 [2024-07-15 10:41:45.305380] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:10.641 [2024-07-15 10:41:47.506690] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:10.641 [2024-07-15 10:41:47.506791] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:10.641 [2024-07-15 10:41:47.506806] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.641 [2024-07-15 10:41:47.514706] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:10.641 [2024-07-15 10:41:47.514726] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:10.641 [2024-07-15 10:41:47.514739] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.641 [2024-07-15 10:41:47.522733] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:10.641 [2024-07-15 10:41:47.522752] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:10.641 [2024-07-15 10:41:47.522764] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.641 [2024-07-15 10:41:47.530755] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:10.641 [2024-07-15 10:41:47.530773] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:10.641 [2024-07-15 10:41:47.530785] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:10.641 10:41:47 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:10.641 10:41:47 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:10.641 10:41:47 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:10.641 I/O targets: 00:34:10.641 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:10.641 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:34:10.641 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:34:10.641 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:10.641 00:34:10.641 00:34:10.641 CUnit - A unit testing framework for C - Version 2.1-3 00:34:10.641 http://cunit.sourceforge.net/ 00:34:10.641 00:34:10.641 00:34:10.641 Suite: bdevio tests on: crypto_ram3 00:34:10.641 Test: blockdev write read block ...passed 00:34:10.641 Test: blockdev write zeroes read block ...passed 00:34:10.641 Test: blockdev write zeroes read no split ...passed 00:34:10.641 Test: blockdev write zeroes read split 
...passed 00:34:10.641 Test: blockdev write zeroes read split partial ...passed 00:34:10.641 Test: blockdev reset ...passed 00:34:10.641 Test: blockdev write read 8 blocks ...passed 00:34:10.641 Test: blockdev write read size > 128k ...passed 00:34:10.641 Test: blockdev write read invalid size ...passed 00:34:10.641 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:10.641 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:10.641 Test: blockdev write read max offset ...passed 00:34:10.641 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:10.641 Test: blockdev writev readv 8 blocks ...passed 00:34:10.641 Test: blockdev writev readv 30 x 1block ...passed 00:34:10.641 Test: blockdev writev readv block ...passed 00:34:10.641 Test: blockdev writev readv size > 128k ...passed 00:34:10.641 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:10.641 Test: blockdev comparev and writev ...passed 00:34:10.641 Test: blockdev nvme passthru rw ...passed 00:34:10.641 Test: blockdev nvme passthru vendor specific ...passed 00:34:10.641 Test: blockdev nvme admin passthru ...passed 00:34:10.641 Test: blockdev copy ...passed 00:34:10.641 Suite: bdevio tests on: crypto_ram2 00:34:10.641 Test: blockdev write read block ...passed 00:34:10.641 Test: blockdev write zeroes read block ...passed 00:34:10.641 Test: blockdev write zeroes read no split ...passed 00:34:10.641 Test: blockdev write zeroes read split ...passed 00:34:10.641 Test: blockdev write zeroes read split partial ...passed 00:34:10.641 Test: blockdev reset ...passed 00:34:10.641 Test: blockdev write read 8 blocks ...passed 00:34:10.641 Test: blockdev write read size > 128k ...passed 00:34:10.641 Test: blockdev write read invalid size ...passed 00:34:10.641 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:10.641 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:10.641 Test: 
blockdev write read max offset ...passed 00:34:10.641 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:10.641 Test: blockdev writev readv 8 blocks ...passed 00:34:10.641 Test: blockdev writev readv 30 x 1block ...passed 00:34:10.641 Test: blockdev writev readv block ...passed 00:34:10.641 Test: blockdev writev readv size > 128k ...passed 00:34:10.641 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:10.641 Test: blockdev comparev and writev ...passed 00:34:10.641 Test: blockdev nvme passthru rw ...passed 00:34:10.641 Test: blockdev nvme passthru vendor specific ...passed 00:34:10.641 Test: blockdev nvme admin passthru ...passed 00:34:10.641 Test: blockdev copy ...passed 00:34:10.641 Suite: bdevio tests on: crypto_ram1 00:34:10.641 Test: blockdev write read block ...passed 00:34:10.641 Test: blockdev write zeroes read block ...passed 00:34:10.641 Test: blockdev write zeroes read no split ...passed 00:34:10.900 Test: blockdev write zeroes read split ...passed 00:34:10.900 Test: blockdev write zeroes read split partial ...passed 00:34:10.900 Test: blockdev reset ...passed 00:34:10.900 Test: blockdev write read 8 blocks ...passed 00:34:10.900 Test: blockdev write read size > 128k ...passed 00:34:10.900 Test: blockdev write read invalid size ...passed 00:34:10.900 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:10.900 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:10.900 Test: blockdev write read max offset ...passed 00:34:10.900 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:10.900 Test: blockdev writev readv 8 blocks ...passed 00:34:10.900 Test: blockdev writev readv 30 x 1block ...passed 00:34:10.900 Test: blockdev writev readv block ...passed 00:34:10.900 Test: blockdev writev readv size > 128k ...passed 00:34:10.900 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:10.900 Test: blockdev comparev and writev 
...passed 00:34:10.900 Test: blockdev nvme passthru rw ...passed 00:34:10.900 Test: blockdev nvme passthru vendor specific ...passed 00:34:10.900 Test: blockdev nvme admin passthru ...passed 00:34:10.900 Test: blockdev copy ...passed 00:34:10.900 Suite: bdevio tests on: crypto_ram 00:34:10.900 Test: blockdev write read block ...passed 00:34:10.900 Test: blockdev write zeroes read block ...passed 00:34:10.900 Test: blockdev write zeroes read no split ...passed 00:34:10.900 Test: blockdev write zeroes read split ...passed 00:34:10.900 Test: blockdev write zeroes read split partial ...passed 00:34:10.900 Test: blockdev reset ...passed 00:34:10.900 Test: blockdev write read 8 blocks ...passed 00:34:10.900 Test: blockdev write read size > 128k ...passed 00:34:10.900 Test: blockdev write read invalid size ...passed 00:34:10.900 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:10.900 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:10.900 Test: blockdev write read max offset ...passed 00:34:10.900 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:10.900 Test: blockdev writev readv 8 blocks ...passed 00:34:10.900 Test: blockdev writev readv 30 x 1block ...passed 00:34:10.900 Test: blockdev writev readv block ...passed 00:34:10.900 Test: blockdev writev readv size > 128k ...passed 00:34:10.900 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:10.900 Test: blockdev comparev and writev ...passed 00:34:10.900 Test: blockdev nvme passthru rw ...passed 00:34:10.900 Test: blockdev nvme passthru vendor specific ...passed 00:34:10.900 Test: blockdev nvme admin passthru ...passed 00:34:10.900 Test: blockdev copy ...passed 00:34:10.900 00:34:10.900 Run Summary: Type Total Ran Passed Failed Inactive 00:34:10.900 suites 4 4 n/a 0 0 00:34:10.900 tests 92 92 92 0 0 00:34:10.900 asserts 520 520 520 0 n/a 00:34:10.900 00:34:10.900 Elapsed time = 0.515 seconds 00:34:10.900 0 00:34:10.900 
10:41:48 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 672054 00:34:10.900 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 672054 ']' 00:34:10.900 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 672054 00:34:10.900 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:10.900 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:10.900 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 672054 00:34:10.900 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:10.900 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:10.901 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 672054' 00:34:10.901 killing process with pid 672054 00:34:10.901 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 672054 00:34:10.901 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 672054 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:34:11.468 00:34:11.468 real 0m3.606s 00:34:11.468 user 0m10.082s 00:34:11.468 sys 0m0.517s 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:11.468 ************************************ 00:34:11.468 END TEST bdev_bounds 00:34:11.468 ************************************ 00:34:11.468 10:41:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:11.468 10:41:48 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:11.468 10:41:48 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:11.468 10:41:48 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:11.468 10:41:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:11.468 ************************************ 00:34:11.468 START TEST bdev_nbd 00:34:11.468 ************************************ 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:34:11.468 10:41:48 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=672452 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 672452 /var/tmp/spdk-nbd.sock 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 672452 ']' 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:34:11.468 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:11.468 10:41:48 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:11.468 [2024-07-15 10:41:48.638116] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:34:11.468 [2024-07-15 10:41:48.638177] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:11.727 [2024-07-15 10:41:48.768712] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:11.727 [2024-07-15 10:41:48.871044] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:11.727 [2024-07-15 10:41:48.892322] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:11.727 [2024-07-15 10:41:48.900345] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:11.727 [2024-07-15 10:41:48.908363] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:11.985 [2024-07-15 10:41:49.014261] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:14.520 [2024-07-15 10:41:51.221393] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:14.520 [2024-07-15 10:41:51.221461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:14.520 [2024-07-15 10:41:51.221477] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.520 [2024-07-15 10:41:51.229413] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:14.520 [2024-07-15 10:41:51.229432] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:34:14.520 [2024-07-15 10:41:51.229444] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.520 [2024-07-15 10:41:51.237432] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:14.520 [2024-07-15 10:41:51.237449] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:14.520 [2024-07-15 10:41:51.237460] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.520 [2024-07-15 10:41:51.245452] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:14.520 [2024-07-15 10:41:51.245471] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:14.520 [2024-07-15 10:41:51.245483] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:14.520 
10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:14.520 1+0 records in 00:34:14.520 1+0 records out 00:34:14.520 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240833 s, 17.0 MB/s 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:14.520 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:34:14.779 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:34:14.779 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:34:14.779 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:34:14.779 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:14.779 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:14.780 1+0 records in 00:34:14.780 1+0 records out 00:34:14.780 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025013 s, 16.4 MB/s 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:14.780 10:41:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:15.038 1+0 records in 00:34:15.038 1+0 records out 00:34:15.038 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299715 s, 13.7 MB/s 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:15.038 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # 
(( i++ )) 00:34:15.039 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:15.039 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:15.296 1+0 records in 00:34:15.296 1+0 records out 00:34:15.296 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000387145 s, 10.6 MB/s 00:34:15.296 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:34:15.553 { 00:34:15.553 "nbd_device": "/dev/nbd0", 00:34:15.553 "bdev_name": "crypto_ram" 00:34:15.553 }, 00:34:15.553 { 00:34:15.553 "nbd_device": "/dev/nbd1", 00:34:15.553 "bdev_name": "crypto_ram1" 00:34:15.553 }, 00:34:15.553 { 00:34:15.553 "nbd_device": "/dev/nbd2", 00:34:15.553 "bdev_name": "crypto_ram2" 00:34:15.553 }, 00:34:15.553 { 00:34:15.553 "nbd_device": "/dev/nbd3", 00:34:15.553 "bdev_name": "crypto_ram3" 00:34:15.553 } 00:34:15.553 ]' 00:34:15.553 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:34:15.812 { 00:34:15.812 "nbd_device": "/dev/nbd0", 00:34:15.812 "bdev_name": "crypto_ram" 00:34:15.812 }, 00:34:15.812 { 00:34:15.812 "nbd_device": "/dev/nbd1", 00:34:15.812 "bdev_name": "crypto_ram1" 00:34:15.812 }, 00:34:15.812 { 00:34:15.812 "nbd_device": "/dev/nbd2", 00:34:15.812 "bdev_name": "crypto_ram2" 00:34:15.812 }, 00:34:15.812 { 00:34:15.812 "nbd_device": "/dev/nbd3", 00:34:15.812 "bdev_name": 
"crypto_ram3" 00:34:15.812 } 00:34:15.812 ]' 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:15.812 10:41:52 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:16.070 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:16.070 10:41:53 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:16.328 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:16.586 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:16.856 10:41:53 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:34:16.856 10:41:54 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:16.856 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:34:17.114 /dev/nbd0 00:34:17.114 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:34:17.114 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:34:17.114 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:34:17.114 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:17.114 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:17.114 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:17.114 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:17.373 1+0 records in 00:34:17.373 1+0 records out 00:34:17.373 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300245 s, 13.6 MB/s 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:34:17.373 /dev/nbd1 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:34:17.373 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:17.374 1+0 records in 00:34:17.374 1+0 records out 00:34:17.374 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308358 s, 13.3 MB/s 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:17.374 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:34:17.632 /dev/nbd10 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:17.632 1+0 records in 00:34:17.632 1+0 records out 00:34:17.632 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297369 s, 13.8 MB/s 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:17.632 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:17.891 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:17.891 10:41:54 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:17.891 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:17.891 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:17.891 10:41:54 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:34:17.891 /dev/nbd11 00:34:17.891 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:34:18.150 1+0 records in 00:34:18.150 1+0 records out 00:34:18.150 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000360377 s, 11.4 MB/s 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:18.150 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd0", 00:34:18.409 "bdev_name": "crypto_ram" 00:34:18.409 }, 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd1", 00:34:18.409 "bdev_name": "crypto_ram1" 00:34:18.409 }, 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd10", 00:34:18.409 "bdev_name": "crypto_ram2" 00:34:18.409 }, 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd11", 00:34:18.409 "bdev_name": "crypto_ram3" 00:34:18.409 } 00:34:18.409 ]' 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd0", 00:34:18.409 "bdev_name": "crypto_ram" 00:34:18.409 }, 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd1", 00:34:18.409 "bdev_name": "crypto_ram1" 00:34:18.409 }, 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd10", 00:34:18.409 "bdev_name": "crypto_ram2" 00:34:18.409 }, 00:34:18.409 { 00:34:18.409 "nbd_device": "/dev/nbd11", 00:34:18.409 "bdev_name": "crypto_ram3" 00:34:18.409 } 00:34:18.409 ]' 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:18.409 10:41:55 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:34:18.409 /dev/nbd1 00:34:18.409 /dev/nbd10 00:34:18.409 /dev/nbd11' 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:34:18.409 /dev/nbd1 00:34:18.409 /dev/nbd10 00:34:18.409 /dev/nbd11' 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:34:18.409 256+0 records in 00:34:18.409 256+0 records out 00:34:18.409 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106851 s, 98.1 MB/s 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:18.409 
10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:34:18.409 256+0 records in 00:34:18.409 256+0 records out 00:34:18.409 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0831941 s, 12.6 MB/s 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:34:18.409 256+0 records in 00:34:18.409 256+0 records out 00:34:18.409 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0665035 s, 15.8 MB/s 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:18.409 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:34:18.668 256+0 records in 00:34:18.668 256+0 records out 00:34:18.668 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0502362 s, 20.9 MB/s 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:34:18.668 256+0 records in 00:34:18.668 256+0 records out 00:34:18.668 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0401148 s, 26.1 MB/s 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:18.668 10:41:55 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:18.668 10:41:55 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:18.927 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:19.186 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:19.444 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:19.703 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:19.703 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:19.703 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:19.961 10:41:56 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:19.961 malloc_lvol_verify 00:34:19.961 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:20.220 ee654532-ac7a-4e28-9b90-86ff618dce0f 00:34:20.220 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:20.479 1652da97-e2c9-4b10-aeb5-da28c952600f 00:34:20.479 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:20.737 /dev/nbd0 
00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:20.737 mke2fs 1.46.5 (30-Dec-2021) 00:34:20.737 Discarding device blocks: 0/4096 done 00:34:20.737 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:20.737 00:34:20.737 Allocating group tables: 0/1 done 00:34:20.737 Writing inode tables: 0/1 done 00:34:20.737 Creating journal (1024 blocks): done 00:34:20.737 Writing superblocks and filesystem accounting information: 0/1 done 00:34:20.737 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:20.737 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 672452 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 672452 ']' 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 672452 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:20.996 10:41:57 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 672452 00:34:20.996 10:41:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:20.996 10:41:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:20.996 10:41:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 672452' 00:34:20.996 killing process with pid 672452 00:34:20.996 10:41:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 672452 00:34:20.996 10:41:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 672452 00:34:21.254 10:41:58 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:34:21.254 00:34:21.254 real 0m9.882s 00:34:21.254 user 0m12.718s 00:34:21.254 sys 0m3.984s 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:21.513 10:41:58 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:21.513 ************************************ 00:34:21.513 END TEST bdev_nbd 00:34:21.513 ************************************ 00:34:21.513 10:41:58 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:21.513 10:41:58 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:34:21.513 10:41:58 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:34:21.513 10:41:58 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:34:21.513 10:41:58 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:34:21.513 10:41:58 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:21.513 10:41:58 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:21.513 10:41:58 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:21.513 ************************************ 00:34:21.513 START TEST bdev_fio 00:34:21.513 ************************************ 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:21.513 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:34:21.513 
10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:21.513 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:21.514 
10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:21.514 ************************************ 00:34:21.514 START TEST bdev_fio_rw_verify 00:34:21.514 ************************************ 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:21.514 10:41:58 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:21.514 10:41:58 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:22.081 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:22.081 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:22.081 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:22.081 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:22.081 fio-3.35 00:34:22.081 Starting 4 threads 00:34:36.987 00:34:36.987 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=674591: Mon Jul 15 10:42:11 2024 00:34:36.987 read: IOPS=21.3k, BW=83.0MiB/s (87.1MB/s)(830MiB/10001msec) 00:34:36.987 slat (usec): min=10, max=1179, avg=63.92, stdev=45.05 00:34:36.987 clat (usec): min=22, max=2026, avg=352.91, stdev=275.71 00:34:36.987 lat (usec): min=52, max=2116, avg=416.83, stdev=306.62 00:34:36.987 clat percentiles (usec): 00:34:36.987 | 50.000th=[ 265], 99.000th=[ 1336], 99.900th=[ 1762], 99.990th=[ 1844], 00:34:36.987 | 99.999th=[ 1876] 00:34:36.987 write: IOPS=23.4k, BW=91.6MiB/s (96.0MB/s)(895MiB/9767msec); 0 zone resets 00:34:36.987 slat (usec): min=18, max=425, avg=77.22, stdev=45.09 00:34:36.987 clat (usec): min=20, 
max=2013, avg=398.25, stdev=285.35 00:34:36.987 lat (usec): min=53, max=2248, avg=475.47, stdev=315.33 00:34:36.987 clat percentiles (usec): 00:34:36.987 | 50.000th=[ 322], 99.000th=[ 1385], 99.900th=[ 1860], 99.990th=[ 1942], 00:34:36.987 | 99.999th=[ 2008] 00:34:36.987 bw ( KiB/s): min=70616, max=139536, per=96.82%, avg=90805.89, stdev=4766.57, samples=76 00:34:36.987 iops : min=17654, max=34884, avg=22701.47, stdev=1191.64, samples=76 00:34:36.987 lat (usec) : 50=0.04%, 100=4.60%, 250=36.71%, 500=35.14%, 750=13.29% 00:34:36.987 lat (usec) : 1000=5.50% 00:34:36.987 lat (msec) : 2=4.72%, 4=0.01% 00:34:36.987 cpu : usr=99.58%, sys=0.01%, ctx=86, majf=0, minf=268 00:34:36.987 IO depths : 1=4.1%, 2=27.4%, 4=54.8%, 8=13.7%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:36.987 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:36.987 complete : 0=0.0%, 4=88.0%, 8=12.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:36.987 issued rwts: total=212564,229012,0,0 short=0,0,0,0 dropped=0,0,0,0 00:34:36.987 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:36.987 00:34:36.987 Run status group 0 (all jobs): 00:34:36.987 READ: bw=83.0MiB/s (87.1MB/s), 83.0MiB/s-83.0MiB/s (87.1MB/s-87.1MB/s), io=830MiB (871MB), run=10001-10001msec 00:34:36.987 WRITE: bw=91.6MiB/s (96.0MB/s), 91.6MiB/s-91.6MiB/s (96.0MB/s-96.0MB/s), io=895MiB (938MB), run=9767-9767msec 00:34:36.987 00:34:36.987 real 0m13.526s 00:34:36.987 user 0m46.024s 00:34:36.987 sys 0m0.487s 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:34:36.987 ************************************ 00:34:36.987 END TEST bdev_fio_rw_verify 00:34:36.987 ************************************ 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:36.987 10:42:12 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:34:36.987 10:42:12 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "18dc96c5-f49b-5ab1-96d1-b74793753fdd"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "18dc96c5-f49b-5ab1-96d1-b74793753fdd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9fc54cd8-9ee7-5a4b-b557-52b732dce01f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9fc54cd8-9ee7-5a4b-b557-52b732dce01f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "55d2336a-82b5-5842-821c-ea5678d01b82"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "55d2336a-82b5-5842-821c-ea5678d01b82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b6c7c11e-5b6d-5164-b8bc-20dcd2f061df"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b6c7c11e-5b6d-5164-b8bc-20dcd2f061df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:34:36.987 crypto_ram1 00:34:36.987 crypto_ram2 00:34:36.987 crypto_ram3 ]] 00:34:36.987 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "18dc96c5-f49b-5ab1-96d1-b74793753fdd"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "18dc96c5-f49b-5ab1-96d1-b74793753fdd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9fc54cd8-9ee7-5a4b-b557-52b732dce01f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9fc54cd8-9ee7-5a4b-b557-52b732dce01f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "55d2336a-82b5-5842-821c-ea5678d01b82"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "55d2336a-82b5-5842-821c-ea5678d01b82",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "b6c7c11e-5b6d-5164-b8bc-20dcd2f061df"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "b6c7c11e-5b6d-5164-b8bc-20dcd2f061df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 
00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:36.988 ************************************ 00:34:36.988 START TEST bdev_fio_trim 00:34:36.988 ************************************ 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in 
"${sanitizers[@]}" 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:36.988 10:42:12 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:36.988 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:36.988 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:36.988 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:36.988 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:36.988 fio-3.35 00:34:36.988 Starting 4 threads 00:34:49.206 00:34:49.206 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=676843: Mon Jul 15 10:42:25 2024 
00:34:49.206 write: IOPS=33.0k, BW=129MiB/s (135MB/s)(1288MiB/10001msec); 0 zone resets 00:34:49.206 slat (usec): min=18, max=1520, avg=72.18, stdev=45.29 00:34:49.206 clat (usec): min=19, max=1784, avg=251.02, stdev=167.98 00:34:49.206 lat (usec): min=63, max=1834, avg=323.19, stdev=199.77 00:34:49.206 clat percentiles (usec): 00:34:49.206 | 50.000th=[ 215], 99.000th=[ 865], 99.900th=[ 1057], 99.990th=[ 1123], 00:34:49.206 | 99.999th=[ 1532] 00:34:49.206 bw ( KiB/s): min=117280, max=161346, per=100.00%, avg=131983.26, stdev=3558.94, samples=76 00:34:49.206 iops : min=29320, max=40336, avg=32995.79, stdev=889.73, samples=76 00:34:49.206 trim: IOPS=33.0k, BW=129MiB/s (135MB/s)(1288MiB/10001msec); 0 zone resets 00:34:49.206 slat (usec): min=5, max=100, avg=19.55, stdev= 7.70 00:34:49.206 clat (usec): min=20, max=1834, avg=323.40, stdev=199.83 00:34:49.206 lat (usec): min=39, max=1850, avg=342.95, stdev=203.59 00:34:49.206 clat percentiles (usec): 00:34:49.206 | 50.000th=[ 273], 99.000th=[ 1037], 99.900th=[ 1270], 99.990th=[ 1352], 00:34:49.206 | 99.999th=[ 1696] 00:34:49.206 bw ( KiB/s): min=117280, max=161346, per=100.00%, avg=131982.84, stdev=3558.89, samples=76 00:34:49.206 iops : min=29320, max=40336, avg=32995.79, stdev=889.73, samples=76 00:34:49.206 lat (usec) : 20=0.01%, 50=0.03%, 100=7.15%, 250=44.79%, 500=37.28% 00:34:49.206 lat (usec) : 750=6.59%, 1000=3.23% 00:34:49.206 lat (msec) : 2=0.92% 00:34:49.206 cpu : usr=99.53%, sys=0.00%, ctx=49, majf=0, minf=90 00:34:49.206 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:34:49.206 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:49.206 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:34:49.206 issued rwts: total=0,329839,329840,0 short=0,0,0,0 dropped=0,0,0,0 00:34:49.206 latency : target=0, window=0, percentile=100.00%, depth=8 00:34:49.206 00:34:49.206 Run status group 0 (all jobs): 00:34:49.206 WRITE: bw=129MiB/s 
(135MB/s), 129MiB/s-129MiB/s (135MB/s-135MB/s), io=1288MiB (1351MB), run=10001-10001msec 00:34:49.206 TRIM: bw=129MiB/s (135MB/s), 129MiB/s-129MiB/s (135MB/s-135MB/s), io=1288MiB (1351MB), run=10001-10001msec 00:34:49.206 00:34:49.206 real 0m13.495s 00:34:49.206 user 0m45.766s 00:34:49.206 sys 0m0.514s 00:34:49.206 10:42:25 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:49.206 10:42:25 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:34:49.206 ************************************ 00:34:49.206 END TEST bdev_fio_trim 00:34:49.206 ************************************ 00:34:49.206 10:42:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:34:49.207 10:42:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:34:49.207 10:42:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:49.207 10:42:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:34:49.207 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:49.207 10:42:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:34:49.207 00:34:49.207 real 0m27.335s 00:34:49.207 user 1m31.944s 00:34:49.207 sys 0m1.182s 00:34:49.207 10:42:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:49.207 10:42:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:49.207 ************************************ 00:34:49.207 END TEST bdev_fio 00:34:49.207 ************************************ 00:34:49.207 10:42:25 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:49.207 10:42:25 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:49.207 10:42:25 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:49.207 10:42:25 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:49.207 10:42:25 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:49.207 10:42:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:49.207 ************************************ 00:34:49.207 START TEST bdev_verify 00:34:49.207 ************************************ 00:34:49.207 10:42:25 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:34:49.207 [2024-07-15 10:42:25.991104] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:34:49.207 [2024-07-15 10:42:25.991166] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid678107 ] 00:34:49.207 [2024-07-15 10:42:26.120213] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:49.207 [2024-07-15 10:42:26.222671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:49.207 [2024-07-15 10:42:26.222676] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:49.207 [2024-07-15 10:42:26.244039] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:49.207 [2024-07-15 10:42:26.252068] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:49.207 [2024-07-15 10:42:26.260095] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:49.207 [2024-07-15 10:42:26.371945] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:34:51.740 [2024-07-15 10:42:28.585716] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:34:51.740 [2024-07-15 10:42:28.585805] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:51.740 [2024-07-15 10:42:28.585821] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:51.740 [2024-07-15 10:42:28.593734] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:34:51.740 [2024-07-15 10:42:28.593754] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:51.740 [2024-07-15 10:42:28.593766] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:51.740 
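The rpc_bdev_crypto_create NOTICE lines here show keys such as test_dek_qat_cbc and test_dek_qat_xts being resolved and vbdev creation deferred until the Malloc base bdevs arrive (the same pattern repeats for the cbc2/xts2 keys). The bdev.json driving this is not included in the log; the sketch below reconstructs a plausible config of that shape using SPDK's JSON-RPC config method names (bdev_malloc_create / bdev_crypto_create with key_name). The block counts and block size are illustrative assumptions, not values taken from this run.

```python
import json

def make_bdev_json():
    # Hypothetical reconstruction of the kind of bdev.json this run loads.
    # Key and bdev names come from the NOTICE lines above; sizes are made up.
    keys = ["test_dek_qat_cbc", "test_dek_qat_xts",
            "test_dek_qat_cbc2", "test_dek_qat_xts2"]
    config = []
    for i, key in enumerate(keys):
        base = f"Malloc{i}"
        # The crypto vbdev entry can precede its base bdev in the config,
        # which is why the log reports creation "deferred pending base
        # bdev arrival".
        config.append({"method": "bdev_malloc_create",
                       "params": {"name": base,
                                  "num_blocks": 65536,
                                  "block_size": 512}})
        config.append({"method": "bdev_crypto_create",
                       "params": {"base_bdev_name": base,
                                  "name": f"crypto_ram{i if i else ''}",
                                  "key_name": key}})
    return json.dumps({"subsystems": [{"subsystem": "bdev",
                                       "config": config}]}, indent=2)

print(make_bdev_json())
```

This yields the four crypto_ram..crypto_ram3 vbdevs on Malloc0..Malloc3 that the verify jobs above target.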
[2024-07-15 10:42:28.601758] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:34:51.740 [2024-07-15 10:42:28.601775] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:51.740 [2024-07-15 10:42:28.601787] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:51.740 [2024-07-15 10:42:28.609777] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:34:51.740 [2024-07-15 10:42:28.609794] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:51.740 [2024-07-15 10:42:28.609805] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:51.740 Running I/O for 5 seconds... 00:34:57.011 00:34:57.011 Latency(us) 00:34:57.011 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:57.011 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 Verification LBA range: start 0x0 length 0x1000 00:34:57.011 crypto_ram : 5.07 496.67 1.94 0.00 0.00 256660.88 4160.11 196949.93 00:34:57.011 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 Verification LBA range: start 0x1000 length 0x1000 00:34:57.011 crypto_ram : 5.08 504.37 1.97 0.00 0.00 253375.28 4786.98 196038.12 00:34:57.011 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 Verification LBA range: start 0x0 length 0x1000 00:34:57.011 crypto_ram1 : 5.08 498.10 1.95 0.00 0.00 255330.05 4445.05 180537.43 00:34:57.011 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 Verification LBA range: start 0x1000 length 0x1000 00:34:57.011 crypto_ram1 : 5.08 504.09 1.97 0.00 0.00 252673.30 5185.89 179625.63 00:34:57.011 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 
Verification LBA range: start 0x0 length 0x1000 00:34:57.011 crypto_ram2 : 5.04 3859.05 15.07 0.00 0.00 32880.61 7009.50 31001.38 00:34:57.011 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 Verification LBA range: start 0x1000 length 0x1000 00:34:57.011 crypto_ram2 : 5.05 3890.43 15.20 0.00 0.00 32595.48 3960.65 31001.38 00:34:57.011 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 Verification LBA range: start 0x0 length 0x1000 00:34:57.011 crypto_ram3 : 5.06 3872.21 15.13 0.00 0.00 32670.06 4274.09 29063.79 00:34:57.011 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:34:57.011 Verification LBA range: start 0x1000 length 0x1000 00:34:57.011 crypto_ram3 : 5.06 3897.48 15.22 0.00 0.00 32456.97 3932.16 28949.82 00:34:57.011 =================================================================================================================== 00:34:57.011 Total : 17522.40 68.45 0.00 0.00 58119.27 3932.16 196949.93 00:34:57.270 00:34:57.270 real 0m8.273s 00:34:57.270 user 0m15.672s 00:34:57.270 sys 0m0.397s 00:34:57.270 10:42:34 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:57.270 10:42:34 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:34:57.270 ************************************ 00:34:57.270 END TEST bdev_verify 00:34:57.270 ************************************ 00:34:57.270 10:42:34 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:34:57.270 10:42:34 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:57.270 10:42:34 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:34:57.270 10:42:34 blockdev_crypto_qat -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:34:57.270 10:42:34 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:34:57.270 ************************************ 00:34:57.270 START TEST bdev_verify_big_io 00:34:57.270 ************************************ 00:34:57.270 10:42:34 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:34:57.270 [2024-07-15 10:42:34.348906] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:34:57.270 [2024-07-15 10:42:34.348971] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid679171 ] 00:34:57.529 [2024-07-15 10:42:34.479309] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 2 00:34:57.529 [2024-07-15 10:42:34.581535] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:57.529 [2024-07-15 10:42:34.581541] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:57.529 [2024-07-15 10:42:34.602950] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:34:57.529 [2024-07-15 10:42:34.610982] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:57.529 [2024-07-15 10:42:34.619006] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:57.789 [2024-07-15 10:42:34.738632] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:00.322 [2024-07-15 10:42:36.966776] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc" 00:35:00.322 [2024-07-15 10:42:36.966854] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:00.322 [2024-07-15 10:42:36.966871] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:00.322 [2024-07-15 10:42:36.974796] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:00.322 [2024-07-15 10:42:36.974815] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:00.322 [2024-07-15 10:42:36.974827] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:00.322 [2024-07-15 10:42:36.982820] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:35:00.322 [2024-07-15 10:42:36.982839] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:00.322 [2024-07-15 10:42:36.982851] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:00.322 [2024-07-15 10:42:36.990843] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:00.322 [2024-07-15 10:42:36.990860] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:00.322 [2024-07-15 10:42:36.990872] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:00.322 Running I/O for 5 seconds... 00:35:00.889 [2024-07-15 10:42:37.981865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.889 [2024-07-15 10:42:37.983596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.889 [2024-07-15 10:42:37.983689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:00.889 [2024-07-15 10:42:37.983750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.889 [2024-07-15 10:42:37.983806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.983849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.984200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.984218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.987719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.987767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.987810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.987857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.988299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.988343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.988388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.988429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
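The repeated "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev_task_alloc_resources indicate transient exhaustion of the mbuf pool backing the cryptodev at queue depth 128: submissions outpace completions, and the affected tasks must wait until in-flight operations return their buffers. The toy sketch below illustrates that generic queue-and-retry pattern only; it is not SPDK's actual code, and the pool and task sizes are invented.

```python
from collections import deque

class MbufPool:
    """Toy fixed-size buffer pool standing in for a DPDK mempool."""
    def __init__(self, size):
        self.free = size

    def get(self, n):
        if self.free < n:
            return None  # transient exhaustion ("Failed to get src_mbufs!")
        self.free -= n
        return n

    def put(self, n):
        self.free += n

def run_tasks(needs, pool):
    """Submit tasks; ones that cannot get buffers retry after a completion."""
    pending = deque(needs)
    inflight = []
    completed = 0
    while pending or inflight:
        # Submit as many pending tasks as the pool currently allows.
        while pending and pool.get(pending[0]) is not None:
            inflight.append(pending.popleft())
        if not inflight:
            raise RuntimeError("a single task exceeds the pool size")
        # Simulate one hardware completion, returning its buffers to the pool.
        pool.put(inflight.pop(0))
        completed += 1
    return completed
```

Because the failures are transient, every task still completes once earlier ones free their buffers, which is why the run above finishes despite the error spam.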
00:35:00.890 [2024-07-15 10:42:37.988903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.988920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.992415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.992472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.992513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.992555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.992997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.993040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.993085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.993127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.993556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.993576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.996919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:00.890 [2024-07-15 10:42:37.996970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.997011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.997052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.997548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.997596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.997637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.997680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.998107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:37.998124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.001317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.001363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.001405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.001451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:00.890 [2024-07-15 10:42:38.001948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.001992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.002036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.002079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.002453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.002471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.005924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.005973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.006015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.006060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.006518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.006562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.006610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:00.890 [2024-07-15 10:42:38.006654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.007150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.007171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.010522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.010567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.010611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.010654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.011078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.011126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.011167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.011208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.011634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.011651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:00.890 [2024-07-15 10:42:38.015030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.015094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.015134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.015175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.015688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.015732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.015772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.015813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.016239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.016257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.019268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.019314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.019354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:00.890 [2024-07-15 10:42:38.019396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.019883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.019932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.019975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.020017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.020415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.020433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.023705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.023751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.023792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.023837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.024297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:00.890 [2024-07-15 10:42:38.024343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:00.890 [2024-07-15 10:42:38.024392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated ~270 times between 10:42:38.024 and 10:42:38.167; intermediate duplicates elided ...]
00:35:01.156 [2024-07-15 10:42:38.167397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:01.156 [2024-07-15 10:42:38.167414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.169768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.170165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.170550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.170941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.172932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.174318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.175889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.177478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.177978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.177995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.180086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.180476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.156 [2024-07-15 10:42:38.180864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.181258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.182958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.184565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.186173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.187366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.187642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.187659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.189866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.190259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.190652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.191268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.193104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.156 [2024-07-15 10:42:38.194745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.196457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.197353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.197705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.197722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.200063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.200460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.200844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.202380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.204299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.205907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.206849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.208607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.156 [2024-07-15 10:42:38.208884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.208901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.211410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.211806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.212576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.213942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.216034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.217738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.218785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.220141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.220416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.220432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.223054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.156 [2024-07-15 10:42:38.223444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.225088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.226531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.228419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.229265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.230949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.232633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.232922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.232944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.235564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.236525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.237885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.156 [2024-07-15 10:42:38.239489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.157 [2024-07-15 10:42:38.241303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.242490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.243840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.245453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.245727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.245744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.248555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.250232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.251962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.253769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.254767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.256166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.257773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.157 [2024-07-15 10:42:38.259386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.259662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.259679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.263343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.264694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.266291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.267895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.269764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.271115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.272704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.274298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.274724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.274741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.157 [2024-07-15 10:42:38.278996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.280660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.282271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.283767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.285433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.287045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.288654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.289856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.290305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.290323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.294068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.295679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.297291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.157 [2024-07-15 10:42:38.298019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.299755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.301371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.302993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.303390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.303856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.303874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.307714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.309317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.310613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.312068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.314010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.315612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.157 [2024-07-15 10:42:38.316601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.317001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.317434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.317452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.321141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.322742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.323467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.324809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.326717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.328430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.328827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.329217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.329611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.157 [2024-07-15 10:42:38.329627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.333251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.334585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.336014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.337368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.339257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.340287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.340681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.341072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.341528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.341546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.344965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.345678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.157 [2024-07-15 10:42:38.347022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.157 [2024-07-15 10:42:38.348625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.460 [2024-07-15 10:42:38.350556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.350960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.351349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.351737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.352203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.352222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.355523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.356664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.357978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.359580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.361254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.461 [2024-07-15 10:42:38.361654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.362045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.362432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.362866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.362884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.365532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.367184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.368584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.370139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.371430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.371823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.372213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.461 [2024-07-15 10:42:38.372610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.461 [2024-07-15 10:42:38.373054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [last message repeated for timestamps 2024-07-15 10:42:38.373071 through 2024-07-15 10:42:38.520217; identical src_mbufs allocation failures omitted] 
00:35:01.463 [2024-07-15 10:42:38.520558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.520623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.520666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.520708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.520749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.521130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.521147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.523768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.523824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.523866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.523920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.524309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.524375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.463 [2024-07-15 10:42:38.524435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.524489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.524531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.524949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.524967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.527346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.527390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.527431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.527474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.527877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.527945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.527999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.528041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.463 [2024-07-15 10:42:38.528082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.528557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.528574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.530910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.530985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.531037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.531078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.531484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.531535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.531576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.531619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.531660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.532125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.463 [2024-07-15 10:42:38.532146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.534542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.534598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.534644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.534686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.535161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.535215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.535257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.535299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.535343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.535765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.535782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.463 [2024-07-15 10:42:38.538066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.538778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.539260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.539278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.541577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.541622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.541663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.463 [2024-07-15 10:42:38.541704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.542122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.542175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.542216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.542261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.542304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.542753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.542770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.463 [2024-07-15 10:42:38.545747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.545880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.546279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.546295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.463 [2024-07-15 10:42:38.548917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.548964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.549437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.549454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.551162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.551226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.463 [2024-07-15 10:42:38.551267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.551308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.551575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.551638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.551681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.551722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.551763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.464 [2024-07-15 10:42:38.552035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.552053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.553753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.553798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.553838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.553878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.554331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.554386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.554428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.554471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.554512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.554898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.554915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.464 [2024-07-15 10:42:38.556751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.556795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.556835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.556875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.557145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.557205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.557246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.557287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.557327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.557730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.557746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.559287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.559340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.464 [2024-07-15 10:42:38.559385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.559426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.559858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.559910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.559958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.560001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.560043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.560463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.560480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.562633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.562679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.562730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.562775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.464 [2024-07-15 10:42:38.563048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.563112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.563156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.563196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.563237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.563507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.563523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.464 [2024-07-15 10:42:38.565762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.565843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.566288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.566306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.568417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.568461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.570075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.570122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.570391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.570451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.570492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.464 [2024-07-15 10:42:38.570532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.464 [2024-07-15 10:42:38.570573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:01.464 [... same error repeated for subsequent timestamps, 10:42:38.570955 through 10:42:38.851025 ...]
00:35:01.726 [2024-07-15 10:42:38.852609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.853950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.854344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.854783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.854808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.858445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.860064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.861094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.862833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.863149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.864775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.866392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.867047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.726 [2024-07-15 10:42:38.867438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.867877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.867894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.871509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.873184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.874028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.875385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.875657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.877334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.878910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.879303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.879691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.880095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.726 [2024-07-15 10:42:38.880112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.883304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.883983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.885610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.887216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.887491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.887898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.888295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.888683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.726 [2024-07-15 10:42:38.889082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.889422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.889438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.892055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.727 [2024-07-15 10:42:38.892451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.892847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.893248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.893689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.894109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.894499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.894890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.895296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.895790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.895808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.898504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.898895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.899286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.727 [2024-07-15 10:42:38.899671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.900110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.900520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.900917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.901314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.901702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.902132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.902150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.904841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.905239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.905629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.906028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.906448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.727 [2024-07-15 10:42:38.906850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.907247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.907636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.908032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.908386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.908403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.911225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.911622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.911670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.912074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.912511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.912907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.913302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.727 [2024-07-15 10:42:38.913698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.914101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.914600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.914621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.917280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.917673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.918064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.918111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.918557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.918966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.919368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.919759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.920157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.727 [2024-07-15 10:42:38.920549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.727 [2024-07-15 10:42:38.920566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.922804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.922850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.922890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.922942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.923373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.923426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.923467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.923510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.923551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.923998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.924015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.988 [2024-07-15 10:42:38.926299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.926345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.926386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.926428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.926862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.926932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.926974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.927015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.927055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.927503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.927520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.929837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.929883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.988 [2024-07-15 10:42:38.929931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.929973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.930381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.930432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.930473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.930514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.930556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.930993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.931012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.933304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.933355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.933397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.933441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.988 [2024-07-15 10:42:38.933933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.933988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.934031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.934073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.934114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.934514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.934531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.936895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.936948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.936989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.937029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.937465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.937521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.988 [2024-07-15 10:42:38.937564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.937606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.937647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.937995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.938012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.940382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.940431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.940472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.940513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.940956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.941010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.941064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.941105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.988 [2024-07-15 10:42:38.941160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.941536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.941555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.943923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.943975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.944016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.944058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.944442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.944510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.944564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.944604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.988 [2024-07-15 10:42:38.944645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.989 [2024-07-15 10:42:38.944989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.989 [2024-07-15 10:42:38.945008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:01.991 [... same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated continuously through 2024-07-15 10:42:39.026425 ...]
00:35:01.991 [2024-07-15 10:42:39.026465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.026732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.026748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.030003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.031614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.032277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.032665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.033111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.033514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.034041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.035452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.037050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.037324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.991 [2024-07-15 10:42:39.037341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.040637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.042256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.042647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.043039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.043419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.043822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.044845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.046191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.047800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.048077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.991 [2024-07-15 10:42:39.048094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.051396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.992 [2024-07-15 10:42:39.052263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.052651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.053039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.053526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.053923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.055507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.057268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.058963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.059237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.059253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.062627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.063028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.063417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.992 [2024-07-15 10:42:39.063806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.064239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.065239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.066589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.068197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.069805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.070122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.070140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.072578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.072979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.073368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.073755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.074178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.992 [2024-07-15 10:42:39.075696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.077393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.079041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.080633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.081059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.081076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.082940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.083336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.083734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.084133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.084443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.085794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.087391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.992 [2024-07-15 10:42:39.088998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.089979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.090260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.090277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.092321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.092715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.093112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.093966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.094283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.096036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.097687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.099276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.100435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.992 [2024-07-15 10:42:39.100763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.100780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.102821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.103226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.103620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.104976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.105319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.106944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.108551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.109631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.111330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.111661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.111678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.992 [2024-07-15 10:42:39.114009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.114405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.115152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.116504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.116778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.118577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.120263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.121346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.122692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.122970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.122988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.125237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.125628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.992 [2024-07-15 10:42:39.126875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.128226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.128499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.992 [2024-07-15 10:42:39.130117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.131324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.132891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.134252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.134523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.134540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.137135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.137681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.139073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.140678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.993 [2024-07-15 10:42:39.140958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.142695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.143617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.144966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.146567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.146839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.146855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.149341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.150478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.151832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.153444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.153719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.155025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.993 [2024-07-15 10:42:39.156507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.157848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.159459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.159734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.159751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.162616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.164026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.165624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.167221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.167496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.168263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.169617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.171201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:01.993 [2024-07-15 10:42:39.172804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.173085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.173103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.176332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.177681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.179271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.180876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.181186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:01.993 [2024-07-15 10:42:39.182469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.183817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.185409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.187003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.187427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.254 [2024-07-15 10:42:39.187445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.192004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.193580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.195210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.196891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.197301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.198849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.200559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.202209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.203784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.204173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.204190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.254 [2024-07-15 10:42:39.207951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.254 [2024-07-15 10:42:39.209563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:02.257 [2024-07-15 10:42:39.394781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.397401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.397462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.397514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.397558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.397954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.398031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.398079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.398121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.398163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.398561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.398577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.400841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.257 [2024-07-15 10:42:39.400885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.400931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.400998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.401393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.401457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.401518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.401572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.401615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.402096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.402114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.404376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.404447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.404499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.257 [2024-07-15 10:42:39.404552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.404923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.404978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.405020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.405060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.405101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.405542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.405559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.407715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.407759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.407805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.407849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.408301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.257 [2024-07-15 10:42:39.408364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.408406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.257 [2024-07-15 10:42:39.408447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.408489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.408912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.408935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.258 [2024-07-15 10:42:39.411931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.411974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.412364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.412381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.414691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.414736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.414777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.414819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.415203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.415264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.415308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.415349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.415404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.258 [2024-07-15 10:42:39.415902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.415920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.418965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.258 [2024-07-15 10:42:39.420653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.420697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.420741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.420781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.421056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.421113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.421154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.421195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.421240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.421613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.421630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.258 [2024-07-15 10:42:39.424173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.424990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.426638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.426682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.426729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.426773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.258 [2024-07-15 10:42:39.427046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.427103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.427150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.427190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.427230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.427496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.427513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.429858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.429904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.429951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.429994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.430266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.430321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.258 [2024-07-15 10:42:39.430370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.430412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.430463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.430735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.430751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.432474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.258 [2024-07-15 10:42:39.432519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.432563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.432604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.432871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.432935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.432977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.433021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.259 [2024-07-15 10:42:39.433062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.433329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.433346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.435614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.435659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.435701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.435745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.436159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.436218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.436259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.436299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.436339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.436673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.259 [2024-07-15 10:42:39.436689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.438921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.439198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.439214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.441327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.259 [2024-07-15 10:42:39.441371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.441412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.441452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.441885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.441948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.441990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.442032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.442073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.442345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.442361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.443993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.444044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.259 [2024-07-15 10:42:39.444092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.259 [2024-07-15 10:42:39.444133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.522 [2024-07-15 10:42:39.670776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.522 [2024-07-15 10:42:39.672122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.673726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.675327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.675647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.676058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.676449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.676837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.677231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.677504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.677520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.680887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.682582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.684362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.522 [2024-07-15 10:42:39.686083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.686440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.686843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.687239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.687630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.688601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.688922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.688943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.691931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.693523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.695130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.695989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.696451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.522 [2024-07-15 10:42:39.696855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.697253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.697643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.699430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.699720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.699736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.703044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.704820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.706560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.706953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.707409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.707811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.708205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.522 [2024-07-15 10:42:39.709099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.710439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.710710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.710727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.713938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.522 [2024-07-15 10:42:39.715533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.716506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.716910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.717343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.717746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.718140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.719387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.720993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.783 [2024-07-15 10:42:39.721265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.721281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.723632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.724032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.724420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.724806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.725233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.726861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.728643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.730380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.732027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.732447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.732463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.783 [2024-07-15 10:42:39.734405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.734803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.735196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.735587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.735998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.736404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.736793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.737186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.737577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.738018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.738036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.740631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.741036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.783 [2024-07-15 10:42:39.741423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.741815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.742264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.742666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.743063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.743454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.743850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.744206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.744224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.746974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.747372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.747765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.748158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.783 [2024-07-15 10:42:39.748573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.748982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.749372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.749762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.750160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.750595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.750613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.753221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.753617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.754011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.754398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.754772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.783 [2024-07-15 10:42:39.755180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.783 [2024-07-15 10:42:39.755570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.755962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.756349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.756786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.756804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.759373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.759768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.760170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.760578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.761060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.761463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.761857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.762247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.784 [2024-07-15 10:42:39.762655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.763062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.763080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.765964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.766367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.766769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.767161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.767630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.768041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.768431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.768825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.769242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.769669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.784 [2024-07-15 10:42:39.769687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.772376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.772771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.773170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.773560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.773919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.774326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.774727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.775118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.775514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.775953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.775971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.778448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.784 [2024-07-15 10:42:39.780221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.780615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.781011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.781392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.781797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.783543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.783951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.784340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.784612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.784629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.788791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.789199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.789248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.784 [2024-07-15 10:42:39.789779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.790056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.790460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.790853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.791253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.791703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.791979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.792006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.794727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.795132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.795540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.795588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.795860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.784 [2024-07-15 10:42:39.796272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.796659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.798230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.798622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.799061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.799078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.801214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.801264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.801305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.801347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.801685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.801739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.784 [2024-07-15 10:42:39.801780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:02.787 [2024-07-15 10:42:39.976006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.787 [2024-07-15 10:42:39.976024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:02.787 [2024-07-15 10:42:39.976038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.983425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.985104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.986242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.986577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.986593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.988778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.989178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.989567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.991304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.993179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.047 [2024-07-15 10:42:39.994788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.995510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.996999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.997267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.997283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.999599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:39.999992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.001138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.002482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.004365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.005705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.007131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.008453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.047 [2024-07-15 10:42:40.008724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.008740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.011889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.013046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.014371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.015975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.017797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.019054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.020394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.022001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.022271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.022287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.024990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.047 [2024-07-15 10:42:40.026573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.028327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.030019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.031010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.032357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.033944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.035543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.035816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.035834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.039467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.040808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.042409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.044010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.047 [2024-07-15 10:42:40.045982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.047418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.049022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.050628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.051025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.051043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.054766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.056365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.057954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.059357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.061013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.047 [2024-07-15 10:42:40.062644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.064257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.065309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.065741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.065759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.069265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.070882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.072493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.073268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.075252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.077030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.078728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.079127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.079573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.079591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.083272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.084867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.086030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.087653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.089572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.091169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.092080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.092475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.092902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.092920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.096839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.098568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.099486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.100834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.102769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.104358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.104747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.105140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.105544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.105562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.109019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.109940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.111658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.113374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.115252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.115841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.116240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.116626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.117115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.117133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.120585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.121647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.122985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.124586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.126323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.126717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.127109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.127497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.127951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.127971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.130516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.132340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.133944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.135604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.136622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.137020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.137404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.137792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.138209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.138228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.140558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.141911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.143521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.145141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.145854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.146252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.146641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.147030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.147302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.147320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.150348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.151847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.152483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.154067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.154952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.156736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.157130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.157724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.158002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.158020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.160954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.162571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.164184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.165488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.166294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.166684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.167079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.168543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.048 [2024-07-15 10:42:40.168858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.168875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.171838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.173471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.175084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.175902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.176710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.048 [2024-07-15 10:42:40.177107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.177492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.177883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.178271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.178290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.181468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.049 [2024-07-15 10:42:40.181865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.182262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.182647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.183516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.183917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.184320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.184713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.185134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.185152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.187917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.188317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.188704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.049 [2024-07-15 10:42:40.189099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.310 [2024-07-15 10:42:40.445788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.446070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.446086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.446101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.452758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.454428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.456038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.456312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.456328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.459547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.459950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.460335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.460721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.310 [2024-07-15 10:42:40.462435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.463800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.465387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.466992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.467432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.467449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.469774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.470169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.470557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.470948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.472979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.474796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.476520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.310 [2024-07-15 10:42:40.478168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.478619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.478636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.480518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.480905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.481293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.481682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.483301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.484905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.486517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.487400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.487674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.487691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.310 [2024-07-15 10:42:40.489695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.490090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.490479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.491211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.493164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.494922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.496094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.310 [2024-07-15 10:42:40.497072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.311 [2024-07-15 10:42:40.497347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.311 [2024-07-15 10:42:40.497364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.311 [2024-07-15 10:42:40.499367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.311 [2024-07-15 10:42:40.499756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.311 [2024-07-15 10:42:40.500149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.311 [2024-07-15 10:42:40.500754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.311 [2024-07-15 10:42:40.502583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.311 [2024-07-15 10:42:40.504224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.505922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.506811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.507175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.507191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.509248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.509639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.510029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.511455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.513407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.515018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.571 [2024-07-15 10:42:40.516060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.517811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.518119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.518136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.520337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.520742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.521140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.522753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.524651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.525087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.526693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.528307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.528579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.571 [2024-07-15 10:42:40.528596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.532133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.533484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.535087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.536699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.538453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.539794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.541408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.543012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.543413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.543429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.547646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.549432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.571 [2024-07-15 10:42:40.551158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.552801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.554881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.556594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.558377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.560072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.560677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.560693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.563507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.563906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.564300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.564347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.565167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.571 [2024-07-15 10:42:40.565559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.565960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.566355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.566807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.566825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.569529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.569579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.569972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.570016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.570911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.570974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.571371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.571429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.571 [2024-07-15 10:42:40.571789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.571806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.574385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.574442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.574829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.574871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.575700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.575745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.576136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.576180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.576584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.576601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.579276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.571 [2024-07-15 10:42:40.579335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.579724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.580127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.580981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.581372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.581418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.581803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.582231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.582248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.584725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.585124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.585518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.571 [2024-07-15 10:42:40.585569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.572 [2024-07-15 10:42:40.586401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.586450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.586834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.587225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.587594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.587617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.590321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.590715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.590770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.591167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.591689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.592082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.592469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.572 [2024-07-15 10:42:40.592513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.592941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.592958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.595692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.595741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.596133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.596526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.597322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.597710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.597755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.598145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.598494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.572 [2024-07-15 10:42:40.598511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.572 [2024-07-15 10:42:40.600872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:03.572 [... same error message repeated for each task from 10:42:40.600872 through 10:42:40.810697 ...]
00:35:03.836 [2024-07-15 10:42:40.810697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:03.836 [2024-07-15 10:42:40.810743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.812311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.812616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.812633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.814849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.814898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.815287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.815334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.815642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.816042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.816087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.816752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.817036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.836 [2024-07-15 10:42:40.817054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.822772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.822826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.824092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.824137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.824610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.825003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.825049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.825449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.825902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.825919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.828290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.828340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.836 [2024-07-15 10:42:40.829820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.829866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.830175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.831951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.832016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.833551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.833886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.833902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.839664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.839718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.841327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.841373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.841686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.836 [2024-07-15 10:42:40.842874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.842919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.844705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.845002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.845019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.847249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.847298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.847685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.847722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.848165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.849494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.849540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.851133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.836 [2024-07-15 10:42:40.851406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.851422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.856156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.857569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.858107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.858153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.858537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.858811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.858865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.859266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.859731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.859777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.860066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.836 [2024-07-15 10:42:40.860084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.863059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.864671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.866283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.867658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.868093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.868493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.868879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.869275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.870782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.871095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.871111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.877502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.836 [2024-07-15 10:42:40.878293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.879993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.880380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.880808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.882578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.836 [2024-07-15 10:42:40.882983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.883549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.884921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.885201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.885218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.888602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.890211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.891742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.837 [2024-07-15 10:42:40.892135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.892587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.892991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.893381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.894710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.896058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.896329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.896345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.901795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.903555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.903954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.904360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.904637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.837 [2024-07-15 10:42:40.905050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.905466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.906996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.908713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.908990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.909007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.912388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.914116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.914506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.914892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.915278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.915677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.916836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.837 [2024-07-15 10:42:40.918185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.919772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.920048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.920065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.925919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.926320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.926706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.928415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.928908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.929312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.931058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.932681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.934264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.837 [2024-07-15 10:42:40.934650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.934666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.936690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.937103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.937495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.938624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.938958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.940573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.942185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.943499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.944971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.945283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.945299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.837 [2024-07-15 10:42:40.951694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.951748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.952145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.952532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.952807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.954217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.955829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.955876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.957599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.958088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.958105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.959634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.960044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.837 [2024-07-15 10:42:40.960433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.960477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.960889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.961293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.961339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.962833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.964187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.964467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.964483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.970085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.971409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.971457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:03.837 [2024-07-15 10:42:40.971892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:03.837 [2024-07-15 10:42:40.972335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [... previous message repeated ~270 times between 10:42:40.972 and 10:42:41.142, all from accel_dpdk_cryptodev.c:468 ...]
00:35:04.102 [2024-07-15 10:42:41.142214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.142256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.142680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.142731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.142773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.142816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.142858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.143134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.143152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.146869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.146946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.146991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.147032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.102 [2024-07-15 10:42:41.147299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.147357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.147398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.147440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.147479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.147823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.147841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.149796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.149841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.149882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.149922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.150387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.150446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.102 [2024-07-15 10:42:41.150489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.150529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.150576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.151002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.151019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.155897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.155952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.155992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.156033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.156301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.156360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.156401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.156441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.102 [2024-07-15 10:42:41.156481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.156745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.156761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.158989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.159584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.160070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.102 [2024-07-15 10:42:41.160088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.164315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.165673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.165721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.165761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.166037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.166100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.166146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.167757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.167802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.168208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.168226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.170537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.102 [2024-07-15 10:42:41.170587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.170629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.171546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.102 [2024-07-15 10:42:41.171897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.171962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.173537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.173582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.173622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.173892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.173908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.178401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.178453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.180248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.103 [2024-07-15 10:42:41.180298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.180784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.181189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.181238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.181279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.182676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.183157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.183175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.184859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.186662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.186718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.186764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.187119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.103 [2024-07-15 10:42:41.187173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.187215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.187256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.188601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.188874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.188890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.193430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.193482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.193525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.193567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.194004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.194065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.194107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.103 [2024-07-15 10:42:41.194147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.195474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.195748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.195764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.199555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.200384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.103 [2024-07-15 10:42:41.200663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.200679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.204732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.204791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.204832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.204888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.205162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.205214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.205262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.205306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.206777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.207106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.207122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.103 [2024-07-15 10:42:41.209648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.209697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.209738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.209778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.210243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.210302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.210345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.210388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.210777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.211055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.103 [2024-07-15 10:42:41.211072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.215486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.215541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.104 [2024-07-15 10:42:41.215586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.216924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.217206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.217265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.217307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.217350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.218951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.219322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.219341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.221282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.221326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.221366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.221753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.104 [2024-07-15 10:42:41.222192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.222246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.222290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.222334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.223846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.224126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.224143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.228834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.228884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.228930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.230143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.230434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.104 [2024-07-15 10:42:41.230489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.104 [2024-07-15 10:42:41.230530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.104 ... previous message repeated at every timestamp from [2024-07-15 10:42:41.230570] through [2024-07-15 10:42:41.498610] ... 
00:35:04.369 [2024-07-15 10:42:41.499110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.499511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.501124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.501514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.501563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.501984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.502001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.504654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.505061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.505454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.505848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.506296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.507628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.369 [2024-07-15 10:42:41.508023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.508962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.509012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.509363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.509380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.513099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.514363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.515044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.515433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.515820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.516232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.517519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.518186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.369 [2024-07-15 10:42:41.518235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.518680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.518699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.522797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.369 [2024-07-15 10:42:41.523206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.523593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.525264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.525766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.526171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.526569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.526969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.527017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.527292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.370 [2024-07-15 10:42:41.527309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.530089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.530685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.532048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.532437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.532820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.534305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.534696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.535090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.535145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.535585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.535602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.539469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.370 [2024-07-15 10:42:41.539868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.540268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.540318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.540612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.541612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.542012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.543266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.543315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.543723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.543740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.549086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.549706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.550102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.370 [2024-07-15 10:42:41.550172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.550613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.551024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.552627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.553030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.553078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.553528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.553545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.558303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.558700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.560414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.560471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.560963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.370 [2024-07-15 10:42:41.561364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.561760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.562159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.562215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.562489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.370 [2024-07-15 10:42:41.562507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.565673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.567347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.567747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.567794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.568234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.569840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.570244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.632 [2024-07-15 10:42:41.570635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.570683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.571085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.571103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.574145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.574551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.574600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.575000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.575275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.575678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.576071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.577840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.577894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.632 [2024-07-15 10:42:41.578386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.578404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.584472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.584877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.584924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.585328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.585720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.586133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.587759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.588151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.588200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.588603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.588620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.632 [2024-07-15 10:42:41.592709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.593114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.593781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.593829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.594112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.594515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.594907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.594967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.595356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.595825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.595841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.601243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.601308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.632 [2024-07-15 10:42:41.602070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.602117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.632 [2024-07-15 10:42:41.602403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.602809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.602855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.603738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.603792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.604158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.604175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.608634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.608689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.609514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.609562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.633 [2024-07-15 10:42:41.610006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.610408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.610457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.610842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.610891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.611178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.611196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.615627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.615682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.616858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.616907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.617290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.618851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.633 [2024-07-15 10:42:41.618900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.619299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.619343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.619773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.619789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.624519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.624574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.625031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.625080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.625355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.625760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.625811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [2024-07-15 10:42:41.626209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.633 [2024-07-15 10:42:41.626264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.633 [... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated for timestamps 10:42:41.626693 through 10:42:41.814321 ...]
00:35:04.636 [2024-07-15 10:42:41.814367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.814408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.815995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.816269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.816286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.819398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.819450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.821039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.821084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.821356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.821416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.821457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.821497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.636 [2024-07-15 10:42:41.822489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.822764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.822780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.827035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.827086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.827475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.636 [2024-07-15 10:42:41.827519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.897 [2024-07-15 10:42:41.827922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.897 [2024-07-15 10:42:41.827984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.828025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.828066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.829368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.829643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.898 [2024-07-15 10:42:41.829660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.835342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.835397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.835446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.835833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.836285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.836350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.836392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.836778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.836821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.837221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.837238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.841938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.898 [2024-07-15 10:42:41.843542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.843589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.845190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.845522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.845587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.845980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.846026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.846413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.846899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.846916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.850570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.851916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.851968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.898 [2024-07-15 10:42:41.853556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.853829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.853889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.855376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.855421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.855804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.856248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.856266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.859727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.860669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.860720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.862231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.862505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.898 [2024-07-15 10:42:41.862562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.864225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.864281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.865912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.866322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.866340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.869797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.871409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.871457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.872405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.872688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.872743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.874141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.898 [2024-07-15 10:42:41.874185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.875809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.876089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.876106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.879173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.880784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.880831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.882437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.882715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.882776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.884096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.884142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.885480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.898 [2024-07-15 10:42:41.885755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.885776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.890231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.891751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.891800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.893381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.893652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.893711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.895389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.895433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.896702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.897014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.897031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.898 [2024-07-15 10:42:41.901032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.901428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.901484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.903118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.903393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.903450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.905057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.905103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.906870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.907244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.907261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.898 [2024-07-15 10:42:41.912055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.912451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.899 [2024-07-15 10:42:41.912495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.912882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.913168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.913223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.914572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.914622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.916230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.916505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.916521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.920672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.921072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.921120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.921162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.899 [2024-07-15 10:42:41.921570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.921621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.922013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.922057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.923553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.923864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.923880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.930462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.931054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.931101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.931487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.931883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.931940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.899 [2024-07-15 10:42:41.932330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.933278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.933325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.933656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.933672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.938569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.938970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.939357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.939744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.940019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.941369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.942974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.944585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.899 [2024-07-15 10:42:41.945296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.945567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.945584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.950164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.951134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.952499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.954108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.954381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.955848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.957166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.958499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.960121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:04.899 [2024-07-15 10:42:41.960394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:04.899 [2024-07-15 10:42:41.960410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [last occurrence 2024-07-15 10:42:42.145800] (the same *ERROR* line from accel_dpdk_cryptodev.c:468 repeated roughly 270 more times between 10:42:41.960 and 10:42:42.146, differing only in timestamps; repeats condensed here)
00:35:05.164 [2024-07-15 10:42:42.145845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.146199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.146217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.148413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.148461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.148849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.148893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.149282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.149681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.149727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.150787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.150833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.151147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.164 [2024-07-15 10:42:42.151164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.154094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.154144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.155738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.155788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.156069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.156875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.156922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.157315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.157361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.157759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.157776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.161474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.164 [2024-07-15 10:42:42.161527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.162920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.162971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.163251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.164613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.164662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.166252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.166298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.166570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.166589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.169283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.169335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.170746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.164 [2024-07-15 10:42:42.172353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.172626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.174269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.174323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.175356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.175402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.175724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.175740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.177528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.177576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.177966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.178024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.178518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.164 [2024-07-15 10:42:42.178916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.178969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.179010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.180403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.180677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.180694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.164 [2024-07-15 10:42:42.182937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.182977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.183247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.183264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.185614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.185670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.185720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.185764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.186142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.186196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.186237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.164 [2024-07-15 10:42:42.186277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.186317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.186633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.186649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.188869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.189139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.189156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.191327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.191371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.191411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.191453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.191892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.191949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.191992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.192034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.192076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.192350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.192367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.194092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.194940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.196893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.196944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.196999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.197040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.197526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.197578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.197620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.197662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.197704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.198107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.198124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.199666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.199712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.199757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.199798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.200077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.200133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.200175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.200215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.200262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.200537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.200553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.202428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.202820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.202864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.202905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.203369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.203426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.203468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.204622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.204668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.204990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.205007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.207906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.207960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.208001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.209611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.209885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.209950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.210499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.210544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.210584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.211029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.211047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.213141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.213184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.214772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.214817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.215092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.216377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.216424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.216465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.218182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.218459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.218475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.165 [2024-07-15 10:42:42.220393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.165 [2024-07-15 10:42:42.220789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [message repeated continuously from 10:42:42.220789 through 10:42:42.380129; identical repeats omitted] 
00:35:05.429 [2024-07-15 10:42:42.380147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.383492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.383888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.384290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.384679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.385145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.385547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.385958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.386355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.386744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.387131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.387150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.389821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.429 [2024-07-15 10:42:42.390217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.390605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.391000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.391403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.391808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.392205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.392602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.392996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.393347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.393364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.396068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.396463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.396857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.429 [2024-07-15 10:42:42.397252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.397695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.398098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.398487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.398887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.399289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.399780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.399803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.402471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.402883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.403279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.403670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.404093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.429 [2024-07-15 10:42:42.404500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.404892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.429 [2024-07-15 10:42:42.405290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.405682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.406144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.406163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.408772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.408821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.409213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.409600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.410000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.410404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.410791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.430 [2024-07-15 10:42:42.410842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.411234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.411673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.411691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.413916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.414312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.414700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.414760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.415181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.415586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.415641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.416033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.416424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.430 [2024-07-15 10:42:42.416897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.416914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.419535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.419930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.419977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.420369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.420802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.420878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.421278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.421663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.421710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.422107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.422125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.430 [2024-07-15 10:42:42.424708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.424757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.425149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.425538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.425864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.426276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.426668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.427060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.427107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.427565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.427582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.429846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.430240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.430 [2024-07-15 10:42:42.430633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.431026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.431413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.431838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.432237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.432624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.432670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.433125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.433143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.435448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.435840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.436232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.436619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.430 [2024-07-15 10:42:42.437014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.437420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.437804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.438195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.438241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.438690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.438707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.440502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.440896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.441292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.441682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.442136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.442539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.430 [2024-07-15 10:42:42.442935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.443326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.443373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.443818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.443835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.446364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.446758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.447149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.447545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.447951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.448355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.448751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.449147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.430 [2024-07-15 10:42:42.449196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.449573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.449589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.451841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.430 [2024-07-15 10:42:42.452238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.452626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.452670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.453115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.453516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.453911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.454307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.454357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.454801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.431 [2024-07-15 10:42:42.454820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.458382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.460008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.460736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.460783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.461062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.462676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.464368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.466121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.466175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.466586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.466603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.470205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.431 [2024-07-15 10:42:42.471813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.473423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.473469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.473766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.475040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.476388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.477996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.478041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.478315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.478332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.480770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.481694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.483051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:05.431 [2024-07-15 10:42:42.483099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.483372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.485002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.486295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.487781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.487829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.488157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.488173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.490354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.490742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.490788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.491536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:05.431 [2024-07-15 10:42:42.491817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:06.367 00:35:06.367 Latency(us) 00:35:06.367 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:06.367 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x0 length 0x100 00:35:06.367 crypto_ram : 6.18 41.44 2.59 0.00 0.00 2994161.75 322779.05 2655176.79 00:35:06.367 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x100 length 0x100 00:35:06.367 crypto_ram : 6.24 41.01 2.56 0.00 0.00 3032304.42 284483.23 2801065.63 00:35:06.367 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x0 length 0x100 00:35:06.367 crypto_ram1 : 6.18 41.43 2.59 0.00 0.00 2883380.98 320955.44 2436343.54 00:35:06.367 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x100 length 0x100 00:35:06.367 crypto_ram1 : 6.24 41.01 2.56 0.00 0.00 2921577.07 284483.23 2582232.38 00:35:06.367 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x0 length 0x100 00:35:06.367 crypto_ram2 : 5.62 257.51 16.09 0.00 0.00 440540.14 25416.57 696619.19 00:35:06.367 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x100 length 0x100 00:35:06.367 crypto_ram2 : 5.67 246.59 15.41 0.00 0.00 460531.54 95283.65 714855.29 00:35:06.367 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x0 length 0x100 00:35:06.367 crypto_ram3 : 5.76 268.19 16.76 0.00 0.00 407894.19 5185.89 510610.92 00:35:06.367 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:35:06.367 Verification LBA range: start 0x100 length 0x100 00:35:06.368 crypto_ram3 : 5.81 258.81 16.18 0.00 0.00 422609.90 
53340.61 348309.59 00:35:06.368 =================================================================================================================== 00:35:06.368 Total : 1196.00 74.75 0.00 0.00 805958.12 5185.89 2801065.63 00:35:06.626 00:35:06.626 real 0m9.452s 00:35:06.626 user 0m17.918s 00:35:06.626 sys 0m0.471s 00:35:06.626 10:42:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:06.626 10:42:43 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:35:06.626 ************************************ 00:35:06.626 END TEST bdev_verify_big_io 00:35:06.626 ************************************ 00:35:06.626 10:42:43 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:06.626 10:42:43 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:06.626 10:42:43 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:06.626 10:42:43 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:06.626 10:42:43 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:06.626 ************************************ 00:35:06.626 START TEST bdev_write_zeroes 00:35:06.626 ************************************ 00:35:06.626 10:42:43 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:06.884 [2024-07-15 10:42:43.874163] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:35:06.884 [2024-07-15 10:42:43.874224] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid680404 ] 00:35:06.884 [2024-07-15 10:42:44.001272] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:07.144 [2024-07-15 10:42:44.102547] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:07.144 [2024-07-15 10:42:44.123842] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:35:07.144 [2024-07-15 10:42:44.131867] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:07.144 [2024-07-15 10:42:44.139886] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:07.144 [2024-07-15 10:42:44.254361] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:35:09.679 [2024-07-15 10:42:46.469235] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:35:09.679 [2024-07-15 10:42:46.469313] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:09.679 [2024-07-15 10:42:46.469329] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.679 [2024-07-15 10:42:46.477255] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:35:09.679 [2024-07-15 10:42:46.477274] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:09.679 [2024-07-15 10:42:46.477286] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.679 [2024-07-15 10:42:46.485275] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:35:09.679 [2024-07-15 10:42:46.485292] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:09.679 [2024-07-15 10:42:46.485304] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.679 [2024-07-15 10:42:46.493296] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:35:09.679 [2024-07-15 10:42:46.493313] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:09.679 [2024-07-15 10:42:46.493324] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:09.679 Running I/O for 1 seconds... 00:35:10.616 00:35:10.616 Latency(us) 00:35:10.616 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:10.616 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:10.616 crypto_ram : 1.03 2021.21 7.90 0.00 0.00 62908.40 5613.30 76135.74 00:35:10.616 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:10.616 crypto_ram1 : 1.03 2026.72 7.92 0.00 0.00 62370.63 5556.31 70209.00 00:35:10.616 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:10.616 crypto_ram2 : 1.02 15565.12 60.80 0.00 0.00 8096.90 2436.23 10656.72 00:35:10.616 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:35:10.616 crypto_ram3 : 1.02 15597.20 60.93 0.00 0.00 8054.28 2421.98 8434.20 00:35:10.616 =================================================================================================================== 00:35:10.616 Total : 35210.24 137.54 0.00 0.00 14375.53 2421.98 76135.74 00:35:10.875 00:35:10.875 real 0m4.177s 00:35:10.875 user 0m3.758s 00:35:10.875 sys 0m0.375s 00:35:10.875 10:42:47 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:10.875 10:42:47 
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:35:10.875 ************************************ 00:35:10.875 END TEST bdev_write_zeroes 00:35:10.875 ************************************ 00:35:10.875 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:35:10.875 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:10.875 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:10.875 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:10.875 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:10.875 ************************************ 00:35:10.875 START TEST bdev_json_nonenclosed 00:35:10.875 ************************************ 00:35:10.875 10:42:48 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:11.133 [2024-07-15 10:42:48.121201] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:35:11.133 [2024-07-15 10:42:48.121259] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid680947 ] 00:35:11.133 [2024-07-15 10:42:48.248670] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:11.392 [2024-07-15 10:42:48.351462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:11.392 [2024-07-15 10:42:48.351529] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:35:11.392 [2024-07-15 10:42:48.351549] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:11.392 [2024-07-15 10:42:48.351561] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:11.392 00:35:11.392 real 0m0.400s 00:35:11.392 user 0m0.239s 00:35:11.392 sys 0m0.157s 00:35:11.392 10:42:48 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:35:11.392 10:42:48 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:11.392 10:42:48 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:35:11.392 ************************************ 00:35:11.392 END TEST bdev_json_nonenclosed 00:35:11.392 ************************************ 00:35:11.392 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:35:11.392 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:35:11.392 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:11.392 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 
13 -le 1 ']' 00:35:11.392 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:11.392 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:11.392 ************************************ 00:35:11.393 START TEST bdev_json_nonarray 00:35:11.393 ************************************ 00:35:11.393 10:42:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:11.393 [2024-07-15 10:42:48.590189] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:35:11.393 [2024-07-15 10:42:48.590248] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid681014 ] 00:35:11.652 [2024-07-15 10:42:48.718478] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:11.652 [2024-07-15 10:42:48.819321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:11.652 [2024-07-15 10:42:48.819398] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:35:11.652 [2024-07-15 10:42:48.819419] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:11.652 [2024-07-15 10:42:48.819431] app.c:1052:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:11.912 00:35:11.912 real 0m0.397s 00:35:11.912 user 0m0.233s 00:35:11.912 sys 0m0.160s 00:35:11.912 10:42:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:35:11.912 10:42:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:11.912 10:42:48 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:35:11.912 ************************************ 00:35:11.912 END TEST bdev_json_nonarray 00:35:11.912 ************************************ 00:35:11.912 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t 
]] 00:35:11.912 10:42:48 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:35:11.912 00:35:11.912 real 1m11.762s 00:35:11.912 user 2m39.911s 00:35:11.912 sys 0m8.901s 00:35:11.913 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:11.913 10:42:48 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:35:11.913 ************************************ 00:35:11.913 END TEST blockdev_crypto_qat 00:35:11.913 ************************************ 00:35:11.913 10:42:49 -- common/autotest_common.sh@1142 -- # return 0 00:35:11.913 10:42:49 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:35:11.913 10:42:49 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:35:11.913 10:42:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:11.913 10:42:49 -- common/autotest_common.sh@10 -- # set +x 00:35:11.913 ************************************ 00:35:11.913 START TEST chaining 00:35:11.913 ************************************ 00:35:11.913 10:42:49 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:35:12.173 * Looking for test storage... 
00:35:12.173 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:12.173 10:42:49 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@7 -- # uname -s 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809e2efd-7f71-e711-906e-0017a4403562 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=809e2efd-7f71-e711-906e-0017a4403562 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:12.173 10:42:49 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:35:12.173 10:42:49 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:12.173 10:42:49 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:12.173 10:42:49 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:12.173 10:42:49 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:12.173 10:42:49 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:12.173 10:42:49 chaining -- paths/export.sh@5 -- # export PATH 00:35:12.173 10:42:49 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@47 -- # : 0 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:35:12.173 10:42:49 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:35:12.173 10:42:49 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:35:12.173 10:42:49 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:35:12.173 10:42:49 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:35:12.173 10:42:49 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:35:12.173 10:42:49 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:12.173 10:42:49 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:12.173 10:42:49 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:12.173 10:42:49 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:12.173 10:42:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@336 -- # return 1 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:20.299 WARNING: No supported devices were found, fallback requested for tcp test 00:35:20.299 10:42:56 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:20.299 Cannot find device "nvmf_tgt_br" 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@155 -- # true 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:20.299 Cannot find device "nvmf_tgt_br2" 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@156 -- # true 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:20.299 Cannot find device "nvmf_tgt_br" 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@158 -- # 
true 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:20.299 Cannot find device "nvmf_tgt_br2" 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@159 -- # true 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:20.299 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@162 -- # true 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:20.299 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@163 -- # true 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:20.299 10:42:56 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:20.299 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:20.299 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.106 ms 00:35:20.299 00:35:20.299 --- 10.0.0.2 ping statistics --- 00:35:20.299 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:20.299 rtt min/avg/max/mdev = 0.106/0.106/0.106/0.000 ms 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:20.299 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:35:20.299 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.074 ms 00:35:20.299 00:35:20.299 --- 10.0.0.3 ping statistics --- 00:35:20.299 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:20.299 rtt min/avg/max/mdev = 0.074/0.074/0.074/0.000 ms 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:20.299 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:35:20.299 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.037 ms 00:35:20.299 00:35:20.299 --- 10.0.0.1 ping statistics --- 00:35:20.299 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:20.299 rtt min/avg/max/mdev = 0.037/0.037/0.037/0.000 ms 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@433 -- # return 0 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:20.299 10:42:57 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:35:20.299 10:42:57 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:20.300 10:42:57 chaining -- nvmf/common.sh@481 -- # nvmfpid=684800 00:35:20.300 10:42:57 chaining -- nvmf/common.sh@482 -- # waitforlisten 684800 00:35:20.300 10:42:57 chaining -- nvmf/common.sh@480 -- # ip netns exec 
nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@829 -- # '[' -z 684800 ']' 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:20.300 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:20.300 10:42:57 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:20.300 [2024-07-15 10:42:57.286142] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:35:20.300 [2024-07-15 10:42:57.286278] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:20.300 [2024-07-15 10:42:57.485066] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:20.558 [2024-07-15 10:42:57.583299] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:20.558 [2024-07-15 10:42:57.583349] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:20.558 [2024-07-15 10:42:57.583364] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:20.558 [2024-07-15 10:42:57.583377] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:20.558 [2024-07-15 10:42:57.583388] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:35:20.558 [2024-07-15 10:42:57.583415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:21.126 10:42:58 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:21.126 10:42:58 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:21.126 10:42:58 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:21.126 10:42:58 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:21.126 10:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:21.126 10:42:58 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:21.126 10:42:58 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:21.126 10:42:58 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.uTtFtWoKsz 00:35:21.126 10:42:58 chaining -- bdev/chaining.sh@69 -- # mktemp 00:35:21.126 10:42:58 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.YRpOimiZhm 00:35:21.126 10:42:58 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:21.126 10:42:58 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:35:21.126 10:42:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:21.127 10:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:21.127 malloc0 00:35:21.127 true 00:35:21.127 true 00:35:21.127 [2024-07-15 10:42:58.254180] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:21.127 crypto0 00:35:21.127 [2024-07-15 10:42:58.262206] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:21.127 crypto1 00:35:21.127 [2024-07-15 10:42:58.270324] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:21.127 [2024-07-15 10:42:58.286558] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:21.127 10:42:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:21.127 10:42:58 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:21.127 10:42:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:21.127 10:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:21.127 10:42:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:35:21.386 10:42:58 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:21.386 10:42:58 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.uTtFtWoKsz bs=1K count=64 00:35:21.386 64+0 records in 00:35:21.386 64+0 records out 00:35:21.386 65536 bytes (66 kB, 64 KiB) copied, 0.00105708 s, 62.0 MB/s 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.uTtFtWoKsz --ob Nvme0n1 --bs 65536 --count 1 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@25 -- # local config 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:21.386 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:21.386 "subsystems": [ 00:35:21.386 { 00:35:21.386 "subsystem": "bdev", 00:35:21.386 "config": [ 00:35:21.386 { 00:35:21.386 "method": "bdev_nvme_attach_controller", 00:35:21.386 "params": { 00:35:21.386 "trtype": "tcp", 00:35:21.386 "adrfam": "IPv4", 00:35:21.386 "name": "Nvme0", 00:35:21.386 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:21.386 "traddr": "10.0.0.2", 00:35:21.386 "trsvcid": "4420" 00:35:21.386 } 00:35:21.386 }, 00:35:21.386 { 00:35:21.386 "method": "bdev_set_options", 00:35:21.386 "params": { 00:35:21.386 "bdev_auto_examine": false 00:35:21.386 } 00:35:21.386 } 00:35:21.386 ] 00:35:21.386 } 00:35:21.386 ] 00:35:21.386 }' 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:21.386 "subsystems": [ 00:35:21.386 { 00:35:21.386 "subsystem": "bdev", 00:35:21.386 "config": [ 00:35:21.386 { 00:35:21.386 "method": "bdev_nvme_attach_controller", 00:35:21.386 "params": { 00:35:21.386 "trtype": "tcp", 00:35:21.386 "adrfam": "IPv4", 00:35:21.386 "name": "Nvme0", 00:35:21.386 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:21.386 "traddr": "10.0.0.2", 00:35:21.386 "trsvcid": 
"4420" 00:35:21.386 } 00:35:21.386 }, 00:35:21.386 { 00:35:21.386 "method": "bdev_set_options", 00:35:21.386 "params": { 00:35:21.386 "bdev_auto_examine": false 00:35:21.386 } 00:35:21.386 } 00:35:21.386 ] 00:35:21.386 } 00:35:21.386 ] 00:35:21.386 }' 00:35:21.386 10:42:58 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.uTtFtWoKsz --ob Nvme0n1 --bs 65536 --count 1 00:35:21.645 [2024-07-15 10:42:58.613291] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:35:21.645 [2024-07-15 10:42:58.613361] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid685014 ] 00:35:21.645 [2024-07-15 10:42:58.741767] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:21.645 [2024-07-15 10:42:58.841087] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:22.163  Copying: 64/64 [kB] (average 12 MBps) 00:35:22.163 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@587 -- # [[ 
0 == 0 ]] 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.163 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.163 10:42:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:22.422 10:42:59 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@96 -- # update_stats 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.422 
10:42:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:22.422 10:42:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.422 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.422 10:42:59 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:22.423 10:42:59 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:22.423 10:42:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:22.423 10:42:59 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.YRpOimiZhm --ib Nvme0n1 --bs 65536 --count 1 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@25 -- # local config 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:22.423 10:42:59 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:22.423 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:22.682 10:42:59 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:22.682 "subsystems": [ 00:35:22.682 { 00:35:22.682 "subsystem": "bdev", 00:35:22.682 "config": [ 00:35:22.682 { 00:35:22.682 "method": "bdev_nvme_attach_controller", 00:35:22.682 
"params": { 00:35:22.682 "trtype": "tcp", 00:35:22.682 "adrfam": "IPv4", 00:35:22.682 "name": "Nvme0", 00:35:22.682 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:22.682 "traddr": "10.0.0.2", 00:35:22.682 "trsvcid": "4420" 00:35:22.682 } 00:35:22.682 }, 00:35:22.682 { 00:35:22.682 "method": "bdev_set_options", 00:35:22.682 "params": { 00:35:22.682 "bdev_auto_examine": false 00:35:22.682 } 00:35:22.682 } 00:35:22.682 ] 00:35:22.682 } 00:35:22.682 ] 00:35:22.682 }' 00:35:22.682 10:42:59 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.YRpOimiZhm --ib Nvme0n1 --bs 65536 --count 1 00:35:22.682 10:42:59 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:22.682 "subsystems": [ 00:35:22.682 { 00:35:22.682 "subsystem": "bdev", 00:35:22.682 "config": [ 00:35:22.682 { 00:35:22.682 "method": "bdev_nvme_attach_controller", 00:35:22.682 "params": { 00:35:22.682 "trtype": "tcp", 00:35:22.682 "adrfam": "IPv4", 00:35:22.682 "name": "Nvme0", 00:35:22.682 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:22.682 "traddr": "10.0.0.2", 00:35:22.682 "trsvcid": "4420" 00:35:22.682 } 00:35:22.682 }, 00:35:22.682 { 00:35:22.682 "method": "bdev_set_options", 00:35:22.682 "params": { 00:35:22.682 "bdev_auto_examine": false 00:35:22.682 } 00:35:22.682 } 00:35:22.682 ] 00:35:22.682 } 00:35:22.682 ] 00:35:22.682 }' 00:35:22.682 [2024-07-15 10:42:59.704464] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:35:22.682 [2024-07-15 10:42:59.704538] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid685104 ] 00:35:22.682 [2024-07-15 10:42:59.835522] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:22.940 [2024-07-15 10:42:59.932899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:23.206  Copying: 64/64 [kB] (average 15 MBps) 00:35:23.206 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:23.206 10:43:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:23.206 10:43:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:23.206 10:43:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.206 10:43:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:23.465 10:43:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.uTtFtWoKsz /tmp/tmp.YRpOimiZhm 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@25 -- # local config 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:23.465 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:23.465 "subsystems": [ 00:35:23.465 { 00:35:23.465 "subsystem": "bdev", 00:35:23.465 "config": [ 00:35:23.465 { 00:35:23.465 "method": "bdev_nvme_attach_controller", 00:35:23.465 "params": { 00:35:23.465 "trtype": "tcp", 00:35:23.465 "adrfam": "IPv4", 00:35:23.465 "name": "Nvme0", 00:35:23.465 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:23.465 "traddr": "10.0.0.2", 00:35:23.465 "trsvcid": "4420" 00:35:23.465 } 00:35:23.465 }, 00:35:23.465 { 00:35:23.465 "method": "bdev_set_options", 00:35:23.465 "params": { 00:35:23.465 "bdev_auto_examine": false 00:35:23.465 } 00:35:23.465 } 00:35:23.465 ] 00:35:23.465 } 00:35:23.465 ] 00:35:23.465 }' 00:35:23.465 
10:43:00 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:35:23.465 10:43:00 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:23.465 "subsystems": [ 00:35:23.465 { 00:35:23.465 "subsystem": "bdev", 00:35:23.465 "config": [ 00:35:23.465 { 00:35:23.465 "method": "bdev_nvme_attach_controller", 00:35:23.465 "params": { 00:35:23.465 "trtype": "tcp", 00:35:23.465 "adrfam": "IPv4", 00:35:23.465 "name": "Nvme0", 00:35:23.465 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:23.465 "traddr": "10.0.0.2", 00:35:23.465 "trsvcid": "4420" 00:35:23.465 } 00:35:23.465 }, 00:35:23.465 { 00:35:23.465 "method": "bdev_set_options", 00:35:23.465 "params": { 00:35:23.465 "bdev_auto_examine": false 00:35:23.465 } 00:35:23.465 } 00:35:23.465 ] 00:35:23.465 } 00:35:23.465 ] 00:35:23.465 }' 00:35:23.465 [2024-07-15 10:43:00.655500] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:35:23.465 [2024-07-15 10:43:00.655569] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid685293 ] 00:35:23.724 [2024-07-15 10:43:00.787070] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:23.724 [2024-07-15 10:43:00.886616] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:24.240  Copying: 64/64 [kB] (average 12 MBps) 00:35:24.240 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@106 -- # update_stats 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:24.240 10:43:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.240 10:43:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.498 10:43:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:24.498 
10:43:01 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:24.498 10:43:01 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:24.498 10:43:01 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:24.498 10:43:01 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.uTtFtWoKsz --ob Nvme0n1 --bs 4096 --count 16 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@25 -- # local config 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:24.498 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:24.498 "subsystems": [ 00:35:24.498 { 00:35:24.498 "subsystem": "bdev", 00:35:24.498 "config": [ 00:35:24.498 { 00:35:24.498 "method": "bdev_nvme_attach_controller", 00:35:24.498 "params": { 00:35:24.498 "trtype": "tcp", 00:35:24.498 "adrfam": "IPv4", 00:35:24.498 "name": "Nvme0", 00:35:24.498 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:24.498 "traddr": "10.0.0.2", 00:35:24.498 "trsvcid": "4420" 00:35:24.498 } 00:35:24.498 }, 00:35:24.498 { 00:35:24.498 "method": "bdev_set_options", 00:35:24.498 "params": { 00:35:24.498 "bdev_auto_examine": false 00:35:24.498 } 00:35:24.498 } 00:35:24.498 ] 00:35:24.498 } 00:35:24.498 ] 00:35:24.498 }' 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.uTtFtWoKsz --ob Nvme0n1 --bs 4096 --count 16 00:35:24.498 10:43:01 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:24.498 "subsystems": [ 00:35:24.498 { 00:35:24.498 "subsystem": "bdev", 00:35:24.498 "config": [ 00:35:24.498 { 00:35:24.498 "method": "bdev_nvme_attach_controller", 00:35:24.498 "params": { 00:35:24.498 "trtype": "tcp", 00:35:24.498 "adrfam": "IPv4", 00:35:24.498 "name": "Nvme0", 00:35:24.498 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:24.498 "traddr": "10.0.0.2", 00:35:24.498 "trsvcid": "4420" 00:35:24.498 } 00:35:24.498 }, 00:35:24.498 { 00:35:24.498 "method": "bdev_set_options", 00:35:24.498 "params": { 00:35:24.498 "bdev_auto_examine": false 00:35:24.498 } 00:35:24.498 } 00:35:24.498 ] 00:35:24.498 } 00:35:24.498 ] 00:35:24.498 }' 00:35:24.498 [2024-07-15 10:43:01.616853] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:35:24.498 [2024-07-15 10:43:01.616916] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid685483 ] 00:35:24.756 [2024-07-15 10:43:01.744746] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:24.756 [2024-07-15 10:43:01.841833] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:25.272  Copying: 64/64 [kB] (average 12 MBps) 00:35:25.272 00:35:25.272 10:43:02 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@40 -- # 
[[ -z '' ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@114 -- # update_stats 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:25.273 10:43:02 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.273 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.532 10:43:02 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:25.532 10:43:02 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@117 -- # : 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.YRpOimiZhm --ib Nvme0n1 --bs 4096 --count 16 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@25 -- # local config 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:35:25.532 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@31 -- # config='{ 00:35:25.532 "subsystems": [ 00:35:25.532 { 00:35:25.532 "subsystem": "bdev", 00:35:25.532 "config": [ 00:35:25.532 { 00:35:25.532 "method": "bdev_nvme_attach_controller", 00:35:25.532 "params": { 00:35:25.532 "trtype": "tcp", 00:35:25.532 "adrfam": "IPv4", 00:35:25.532 "name": "Nvme0", 00:35:25.532 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:25.532 "traddr": "10.0.0.2", 00:35:25.532 "trsvcid": "4420" 00:35:25.532 } 00:35:25.532 }, 00:35:25.532 { 00:35:25.532 "method": "bdev_set_options", 00:35:25.532 "params": { 00:35:25.532 "bdev_auto_examine": false 00:35:25.532 } 00:35:25.532 } 00:35:25.532 ] 00:35:25.532 } 00:35:25.532 ] 00:35:25.532 }' 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.YRpOimiZhm --ib Nvme0n1 --bs 4096 --count 16 00:35:25.532 10:43:02 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:35:25.532 "subsystems": [ 00:35:25.532 { 00:35:25.532 "subsystem": "bdev", 00:35:25.532 "config": [ 00:35:25.532 { 00:35:25.532 "method": "bdev_nvme_attach_controller", 00:35:25.532 "params": { 00:35:25.532 "trtype": "tcp", 00:35:25.532 "adrfam": "IPv4", 00:35:25.532 "name": "Nvme0", 00:35:25.532 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:35:25.532 "traddr": "10.0.0.2", 00:35:25.532 "trsvcid": "4420" 00:35:25.532 } 00:35:25.532 }, 00:35:25.532 { 00:35:25.532 "method": "bdev_set_options", 00:35:25.532 "params": { 00:35:25.532 "bdev_auto_examine": false 00:35:25.532 } 00:35:25.532 } 00:35:25.532 ] 00:35:25.532 } 00:35:25.532 ] 00:35:25.532 }' 00:35:25.532 [2024-07-15 10:43:02.715480] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 
initialization... 00:35:25.532 [2024-07-15 10:43:02.715529] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid685633 ] 00:35:25.790 [2024-07-15 10:43:02.825229] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:25.790 [2024-07-15 10:43:02.924029] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:26.304  Copying: 64/64 [kB] (average 1361 kBps) 00:35:26.304 00:35:26.304 10:43:03 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:35:26.304 10:43:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.305 10:43:03 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.305 10:43:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.305 10:43:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:35:26.562 10:43:03 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.uTtFtWoKsz /tmp/tmp.YRpOimiZhm 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.uTtFtWoKsz /tmp/tmp.YRpOimiZhm 00:35:26.562 10:43:03 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@117 -- # sync 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@120 -- # set +e 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:26.562 rmmod nvme_tcp 00:35:26.562 rmmod nvme_fabrics 00:35:26.562 rmmod nvme_keyring 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@124 -- # set -e 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@125 -- # return 0 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@489 -- # '[' -n 684800 ']' 00:35:26.562 10:43:03 chaining -- nvmf/common.sh@490 -- # killprocess 684800 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@948 -- # '[' -z 
684800 ']' 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@952 -- # kill -0 684800 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@953 -- # uname 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 684800 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 684800' 00:35:26.562 killing process with pid 684800 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@967 -- # kill 684800 00:35:26.562 10:43:03 chaining -- common/autotest_common.sh@972 -- # wait 684800 00:35:26.820 10:43:03 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:26.820 10:43:03 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:26.820 10:43:03 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:26.820 10:43:03 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:26.820 10:43:03 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:26.820 10:43:03 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:26.820 10:43:03 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:26.820 10:43:03 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:35:26.820 10:43:03 chaining -- bdev/chaining.sh@132 -- # bperfpid=685812 00:35:26.820 10:43:03 chaining -- bdev/chaining.sh@134 -- # waitforlisten 685812 00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@829 -- # '[' -z 685812 ']' 
00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:26.820 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:26.820 10:43:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:26.820 10:43:03 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:27.077 [2024-07-15 10:43:04.037519] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:35:27.077 [2024-07-15 10:43:04.037591] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid685812 ] 00:35:27.077 [2024-07-15 10:43:04.179916] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:27.336 [2024-07-15 10:43:04.317494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:27.902 10:43:05 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:27.902 10:43:05 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:27.902 10:43:05 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:35:27.902 10:43:05 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:27.902 10:43:05 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:28.161 malloc0 00:35:28.161 true 00:35:28.161 true 00:35:28.161 [2024-07-15 10:43:05.203877] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:28.161 
crypto0 00:35:28.161 [2024-07-15 10:43:05.211903] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:28.161 crypto1 00:35:28.161 10:43:05 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:28.161 10:43:05 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:28.418 Running I/O for 5 seconds... 00:35:33.674 00:35:33.674 Latency(us) 00:35:33.674 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:33.674 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:33.674 Verification LBA range: start 0x0 length 0x2000 00:35:33.675 crypto1 : 5.01 11464.83 44.78 0.00 0.00 22259.06 3376.53 14303.94 00:35:33.675 =================================================================================================================== 00:35:33.675 Total : 11464.83 44.78 0.00 0.00 22259.06 3376.53 14303.94 00:35:33.675 0 00:35:33.675 10:43:10 chaining -- bdev/chaining.sh@146 -- # killprocess 685812 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@948 -- # '[' -z 685812 ']' 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@952 -- # kill -0 685812 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@953 -- # uname 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 685812 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 685812' 00:35:33.675 killing process with pid 685812 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@967 -- # kill 685812 00:35:33.675 Received shutdown signal, test time was about 5.000000 
seconds 00:35:33.675 00:35:33.675 Latency(us) 00:35:33.675 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:33.675 =================================================================================================================== 00:35:33.675 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@972 -- # wait 685812 00:35:33.675 10:43:10 chaining -- bdev/chaining.sh@152 -- # bperfpid=686626 00:35:33.675 10:43:10 chaining -- bdev/chaining.sh@154 -- # waitforlisten 686626 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@829 -- # '[' -z 686626 ']' 00:35:33.675 10:43:10 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:33.675 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:33.675 10:43:10 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:33.675 [2024-07-15 10:43:10.805089] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
00:35:33.675 [2024-07-15 10:43:10.805158] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid686626 ] 00:35:33.933 [2024-07-15 10:43:10.934132] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:33.933 [2024-07-15 10:43:11.042448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:34.192 10:43:11 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:34.192 10:43:11 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:34.192 10:43:11 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:35:34.192 10:43:11 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:34.192 10:43:11 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:34.451 malloc0 00:35:34.451 true 00:35:34.451 true 00:35:34.451 [2024-07-15 10:43:11.414239] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:35:34.451 [2024-07-15 10:43:11.414284] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:35:34.451 [2024-07-15 10:43:11.414306] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2279730 00:35:34.451 [2024-07-15 10:43:11.414318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:35:34.451 [2024-07-15 10:43:11.415382] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:35:34.451 [2024-07-15 10:43:11.415406] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:35:34.451 pt0 00:35:34.451 [2024-07-15 10:43:11.422270] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:34.451 crypto0 00:35:34.451 [2024-07-15 10:43:11.430289] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:35:34.451 crypto1 00:35:34.451 10:43:11 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:34.451 10:43:11 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:35:34.451 Running I/O for 5 seconds... 00:35:39.766 00:35:39.766 Latency(us) 00:35:39.766 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:39.766 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:39.766 Verification LBA range: start 0x0 length 0x2000 00:35:39.766 crypto1 : 5.02 9075.18 35.45 0.00 0.00 28132.75 6553.60 16868.40 00:35:39.766 =================================================================================================================== 00:35:39.766 Total : 9075.18 35.45 0.00 0.00 28132.75 6553.60 16868.40 00:35:39.766 0 00:35:39.766 10:43:16 chaining -- bdev/chaining.sh@167 -- # killprocess 686626 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@948 -- # '[' -z 686626 ']' 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@952 -- # kill -0 686626 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@953 -- # uname 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 686626 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 686626' 00:35:39.766 killing process with pid 686626 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@967 -- # kill 686626 00:35:39.766 Received shutdown signal, test time was about 5.000000 seconds 00:35:39.766 00:35:39.766 Latency(us) 00:35:39.766 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:39.766 
=================================================================================================================== 00:35:39.766 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@972 -- # wait 686626 00:35:39.766 10:43:16 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:35:39.766 10:43:16 chaining -- bdev/chaining.sh@170 -- # killprocess 686626 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@948 -- # '[' -z 686626 ']' 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@952 -- # kill -0 686626 00:35:39.766 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (686626) - No such process 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 686626 is not found' 00:35:39.766 Process with pid 686626 is not found 00:35:39.766 10:43:16 chaining -- bdev/chaining.sh@171 -- # wait 686626 00:35:39.766 10:43:16 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:39.766 10:43:16 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:35:39.766 10:43:16 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@296 -- # e810=() 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@297 -- # x722=() 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@298 -- # mlx=() 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:35:39.766 10:43:16 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:35:39.766 10:43:16 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@336 -- # return 1 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:35:39.767 WARNING: No supported devices were found, fallback requested for tcp test 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:35:39.767 10:43:16 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:35:39.767 10:43:16 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:35:40.027 Cannot find device "nvmf_tgt_br" 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@155 -- # true 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:35:40.027 Cannot find device "nvmf_tgt_br2" 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@156 -- # true 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:35:40.027 Cannot find device "nvmf_tgt_br" 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@158 -- # true 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:35:40.027 Cannot find device "nvmf_tgt_br2" 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@159 -- # true 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:35:40.027 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:40.027 10:43:17 chaining -- nvmf/common.sh@162 -- # true 00:35:40.027 10:43:17 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:35:40.027 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@163 -- # true 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:35:40.285 10:43:17 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:35:40.543 10:43:17 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:35:40.543 10:43:17 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:35:40.802 10:43:17 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:35:40.802 10:43:17 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:35:40.802 10:43:17 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:35:40.803 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:35:40.803 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.105 ms 00:35:40.803 00:35:40.803 --- 10.0.0.2 ping statistics --- 00:35:40.803 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:40.803 rtt min/avg/max/mdev = 0.105/0.105/0.105/0.000 ms 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:35:40.803 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:35:40.803 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.053 ms 00:35:40.803 00:35:40.803 --- 10.0.0.3 ping statistics --- 00:35:40.803 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:40.803 rtt min/avg/max/mdev = 0.053/0.053/0.053/0.000 ms 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:35:40.803 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:35:40.803 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.042 ms 00:35:40.803 00:35:40.803 --- 10.0.0.1 ping statistics --- 00:35:40.803 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:35:40.803 rtt min/avg/max/mdev = 0.042/0.042/0.042/0.000 ms 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@433 -- # return 0 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:35:40.803 10:43:17 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:35:40.803 10:43:17 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:40.803 10:43:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@481 -- # nvmfpid=687798 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@482 -- # waitforlisten 687798 00:35:40.803 10:43:17 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:35:40.803 10:43:17 chaining -- common/autotest_common.sh@829 -- # '[' -z 687798 ']' 00:35:40.803 10:43:17 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:40.803 10:43:17 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:40.803 10:43:17 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:40.803 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:40.803 10:43:17 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:40.803 10:43:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:40.803 [2024-07-15 10:43:17.908814] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:35:40.803 [2024-07-15 10:43:17.908885] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:41.062 [2024-07-15 10:43:18.035697] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:41.062 [2024-07-15 10:43:18.145706] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:35:41.062 [2024-07-15 10:43:18.145747] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:35:41.062 [2024-07-15 10:43:18.145763] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:35:41.062 [2024-07-15 10:43:18.145776] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:35:41.062 [2024-07-15 10:43:18.145787] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:35:41.062 [2024-07-15 10:43:18.145814] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:42.000 10:43:18 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:42.000 10:43:18 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:35:42.000 10:43:18 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:42.000 malloc0 00:35:42.000 [2024-07-15 10:43:18.906059] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:35:42.000 [2024-07-15 10:43:18.922266] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:42.000 10:43:18 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:35:42.000 10:43:18 chaining -- bdev/chaining.sh@189 -- # bperfpid=687958 00:35:42.000 10:43:18 chaining -- bdev/chaining.sh@191 -- # waitforlisten 687958 /var/tmp/bperf.sock 00:35:42.000 10:43:18 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@829 -- # '[' -z 687958 ']' 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:42.000 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:42.000 10:43:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:42.000 [2024-07-15 10:43:18.996948] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 00:35:42.000 [2024-07-15 10:43:18.997012] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid687958 ] 00:35:42.000 [2024-07-15 10:43:19.125270] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:42.260 [2024-07-15 10:43:19.231968] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:42.828 10:43:19 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:42.828 10:43:19 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:42.828 10:43:19 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:35:42.828 10:43:19 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:43.394 [2024-07-15 10:43:20.346855] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:43.394 nvme0n1 00:35:43.394 true 00:35:43.394 crypto0 00:35:43.394 10:43:20 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:43.394 Running I/O for 5 seconds... 
00:35:48.657 00:35:48.657 Latency(us) 00:35:48.657 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:48.657 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:35:48.657 Verification LBA range: start 0x0 length 0x2000 00:35:48.657 crypto0 : 5.02 8306.81 32.45 0.00 0.00 30715.49 2991.86 30317.52 00:35:48.657 =================================================================================================================== 00:35:48.657 Total : 8306.81 32.45 0.00 0.00 30715.49 2991.86 30317.52 00:35:48.657 0 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@205 -- # sequence=83372 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:48.657 10:43:25 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:48.657 10:43:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@206 -- # encrypt=41686 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:48.915 10:43:26 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@207 -- # decrypt=41686 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:49.173 10:43:26 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:49.173 10:43:26 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:49.431 10:43:26 chaining -- bdev/chaining.sh@208 -- # crc32c=83372 00:35:49.431 10:43:26 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:35:49.431 10:43:26 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:35:49.431 10:43:26 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:35:49.431 10:43:26 chaining -- bdev/chaining.sh@214 -- # killprocess 687958 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@948 -- # '[' -z 687958 ']' 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@952 -- # kill -0 687958 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@953 -- # uname 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 687958 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 687958' 00:35:49.431 killing process with pid 687958 00:35:49.431 10:43:26 chaining -- common/autotest_common.sh@967 -- # kill 687958 00:35:49.431 Received shutdown signal, test time was about 5.000000 seconds 00:35:49.431 00:35:49.431 Latency(us) 00:35:49.431 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:49.431 
=================================================================================================================== 00:35:49.431 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:49.432 10:43:26 chaining -- common/autotest_common.sh@972 -- # wait 687958 00:35:49.690 10:43:26 chaining -- bdev/chaining.sh@219 -- # bperfpid=689018 00:35:49.690 10:43:26 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:35:49.690 10:43:26 chaining -- bdev/chaining.sh@221 -- # waitforlisten 689018 /var/tmp/bperf.sock 00:35:49.690 10:43:26 chaining -- common/autotest_common.sh@829 -- # '[' -z 689018 ']' 00:35:49.690 10:43:26 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:35:49.690 10:43:26 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:49.690 10:43:26 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:35:49.690 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:35:49.690 10:43:26 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:49.690 10:43:26 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:49.948 [2024-07-15 10:43:26.894664] Starting SPDK v24.09-pre git sha1 719d03c6a / DPDK 24.03.0 initialization... 
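The pass/fail logic the run above just exercised — `(( sequence > 0 ))`, `(( encrypt + decrypt == sequence ))`, `(( encrypt + decrypt == crc32c ))` against the jq-extracted `accel_get_stats` fields — can be sketched in Python. The JSON payload below is hypothetical (shaped after the fields the test parses, with the counter values reported in this run: 41686 encrypts, 41686 decrypts, 83372 sequences/crc32c); the real values come over the `/var/tmp/bperf.sock` RPC socket.

```python
import json

# Hypothetical accel_get_stats payload, shaped like the fields the
# chaining test extracts with jq (values taken from the first run above).
stats_json = """
{
  "sequence_executed": 83372,
  "operations": [
    {"opcode": "encrypt", "executed": 41686},
    {"opcode": "decrypt", "executed": 41686},
    {"opcode": "crc32c",  "executed": 83372}
  ]
}
"""

def check_chaining(stats: dict) -> bool:
    """Mirror the shell arithmetic checks: sequence > 0,
    encrypt + decrypt == sequence_executed, and
    encrypt + decrypt == crc32c executions."""
    by_op = {op["opcode"]: op["executed"] for op in stats["operations"]}
    sequence = stats["sequence_executed"]
    executed = by_op["encrypt"] + by_op["decrypt"]
    return sequence > 0 and executed == sequence and executed == by_op["crc32c"]

print(check_chaining(json.loads(stats_json)))  # → True
```

In the log, each of the three checks appears as a separate `(( ... ))` shell evaluation; the sketch folds them into one predicate purely for readability.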
00:35:49.948 [2024-07-15 10:43:26.894741] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid689018 ] 00:35:49.948 [2024-07-15 10:43:27.024329] app.c: 908:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:49.948 [2024-07-15 10:43:27.120072] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:50.883 10:43:27 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:50.883 10:43:27 chaining -- common/autotest_common.sh@862 -- # return 0 00:35:50.883 10:43:27 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:35:50.883 10:43:27 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:35:51.165 [2024-07-15 10:43:28.226351] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:35:51.165 nvme0n1 00:35:51.165 true 00:35:51.165 crypto0 00:35:51.165 10:43:28 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:35:51.165 Running I/O for 5 seconds... 
00:35:56.426 00:35:56.426 Latency(us) 00:35:56.426 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:56.426 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:35:56.426 Verification LBA range: start 0x0 length 0x200 00:35:56.426 crypto0 : 5.01 1692.45 105.78 0.00 0.00 18529.76 391.79 20971.52 00:35:56.426 =================================================================================================================== 00:35:56.426 Total : 1692.45 105.78 0.00 0.00 18529.76 391.79 20971.52 00:35:56.426 0 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@39 -- # opcode= 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:35:56.426 10:43:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@233 -- # sequence=16944 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:35:56.684 10:43:33 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:56.684 10:43:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@234 -- # encrypt=8472 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:35:56.941 10:43:33 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@235 -- # decrypt=8472 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:35:57.200 10:43:34 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:35:57.200 10:43:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:35:57.458 10:43:34 chaining -- bdev/chaining.sh@236 -- # crc32c=16944 00:35:57.458 10:43:34 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:35:57.458 10:43:34 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:35:57.458 10:43:34 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:35:57.458 10:43:34 chaining -- bdev/chaining.sh@242 -- # killprocess 689018 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 689018 ']' 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@952 -- # kill -0 689018 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@953 -- # uname 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 689018 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 689018' 00:35:57.458 killing process with pid 689018 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@967 -- # kill 689018 00:35:57.458 Received shutdown signal, test time was about 5.000000 seconds 00:35:57.458 00:35:57.458 Latency(us) 00:35:57.458 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:57.458 
=================================================================================================================== 00:35:57.458 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:35:57.458 10:43:34 chaining -- common/autotest_common.sh@972 -- # wait 689018 00:35:57.715 10:43:34 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@117 -- # sync 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@120 -- # set +e 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:35:57.715 rmmod nvme_tcp 00:35:57.715 rmmod nvme_fabrics 00:35:57.715 rmmod nvme_keyring 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@124 -- # set -e 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@125 -- # return 0 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@489 -- # '[' -n 687798 ']' 00:35:57.715 10:43:34 chaining -- nvmf/common.sh@490 -- # killprocess 687798 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 687798 ']' 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@952 -- # kill -0 687798 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@953 -- # uname 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 687798 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 687798' 00:35:57.715 killing process with pid 687798 
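The `nvmftestfini` cleanup above runs `modprobe -v -r` for each nvme module inside a `for i in {1..20}` loop with `set +e`, tolerating transient removal failures. A minimal Python sketch of that bounded-retry pattern follows; `true` stands in for the real `modprobe` invocation so the sketch runs anywhere, and the module name is only illustrative.

```python
import subprocess

def remove_module(name: str, attempts: int = 20) -> bool:
    """Bounded-retry unload, mirroring the {1..20} loop in nvmf/common.sh.
    'true' stands in for 'modprobe -v -r <name>' so the sketch runs
    without the nvme modules actually loaded."""
    for _ in range(attempts):
        # Real script would run: ["modprobe", "-v", "-r", name]
        result = subprocess.run(["true"])
        if result.returncode == 0:
            return True  # module removed (or was never loaded)
    return False  # gave up after all attempts

print(remove_module("nvme-tcp"))  # → True
```

The retry cap matters in the real script because module removal can fail while in-flight I/O drains; capping at 20 attempts keeps the cleanup from hanging the job indefinitely.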
00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@967 -- # kill 687798 00:35:57.715 10:43:34 chaining -- common/autotest_common.sh@972 -- # wait 687798 00:35:57.974 10:43:35 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:35:57.974 10:43:35 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:35:57.974 10:43:35 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:35:57.974 10:43:35 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:35:57.974 10:43:35 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:35:57.974 10:43:35 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:35:57.974 10:43:35 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:35:57.974 10:43:35 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:35:57.974 10:43:35 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:35:57.974 10:43:35 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:35:57.974 00:35:57.974 real 0m46.052s 00:35:57.974 user 0m59.822s 00:35:57.974 sys 0m13.480s 00:35:57.974 10:43:35 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:57.974 10:43:35 chaining -- common/autotest_common.sh@10 -- # set +x 00:35:57.974 ************************************ 00:35:57.974 END TEST chaining 00:35:57.974 ************************************ 00:35:57.974 10:43:35 -- common/autotest_common.sh@1142 -- # return 0 00:35:57.974 10:43:35 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:35:57.974 10:43:35 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:35:57.974 10:43:35 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:35:57.974 10:43:35 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:35:57.974 10:43:35 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:35:57.974 10:43:35 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:35:57.974 10:43:35 -- common/autotest_common.sh@722 -- # xtrace_disable 00:35:57.974 10:43:35 
-- common/autotest_common.sh@10 -- # set +x 00:35:57.974 10:43:35 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:35:57.974 10:43:35 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:35:57.974 10:43:35 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:35:57.974 10:43:35 -- common/autotest_common.sh@10 -- # set +x 00:36:03.292 INFO: APP EXITING 00:36:03.292 INFO: killing all VMs 00:36:03.292 INFO: killing vhost app 00:36:03.292 INFO: EXIT DONE 00:36:06.579 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:06.579 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:06.579 Waiting for block devices as requested 00:36:06.579 0000:5e:00.0 (8086 0b60): vfio-pci -> nvme 00:36:06.579 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:06.579 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:06.579 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:06.579 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:06.835 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:06.835 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:06.835 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:07.092 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:07.092 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:36:07.092 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:36:07.350 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:36:07.350 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:36:07.350 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:36:07.608 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:36:07.608 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:36:07.608 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:36:11.793 0000:d7:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:d7:05.5 00:36:11.793 0000:85:05.5 (8086 201d): Skipping not allowed VMD controller at 0000:85:05.5 00:36:11.793 Cleaning 00:36:11.793 Removing: /var/run/dpdk/spdk0/config 00:36:11.793 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:36:11.793 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:36:11.793 Removing: /var/run/dpdk/spdk0/hugepage_info 00:36:11.793 Removing: /dev/shm/nvmf_trace.0 00:36:11.793 Removing: /dev/shm/spdk_tgt_trace.pid433683 00:36:11.793 Removing: /var/run/dpdk/spdk0 00:36:11.793 Removing: /var/run/dpdk/spdk_pid432754 00:36:11.793 Removing: /var/run/dpdk/spdk_pid433683 00:36:11.793 Removing: /var/run/dpdk/spdk_pid434230 00:36:11.793 Removing: /var/run/dpdk/spdk_pid434957 00:36:11.793 Removing: /var/run/dpdk/spdk_pid435145 00:36:11.793 Removing: /var/run/dpdk/spdk_pid435979 00:36:11.793 Removing: /var/run/dpdk/spdk_pid436208 00:36:11.793 Removing: /var/run/dpdk/spdk_pid436490 00:36:11.793 Removing: /var/run/dpdk/spdk_pid439420 00:36:11.793 Removing: /var/run/dpdk/spdk_pid440772 00:36:11.793 Removing: /var/run/dpdk/spdk_pid441053 00:36:11.793 Removing: /var/run/dpdk/spdk_pid441291 00:36:11.793 Removing: /var/run/dpdk/spdk_pid441635 00:36:11.793 Removing: /var/run/dpdk/spdk_pid441947 00:36:11.793 Removing: /var/run/dpdk/spdk_pid442146 00:36:11.793 Removing: /var/run/dpdk/spdk_pid442346 00:36:11.793 Removing: /var/run/dpdk/spdk_pid442567 00:36:11.793 Removing: /var/run/dpdk/spdk_pid443315 00:36:11.793 Removing: /var/run/dpdk/spdk_pid446020 00:36:11.793 Removing: /var/run/dpdk/spdk_pid446216 00:36:11.793 Removing: /var/run/dpdk/spdk_pid446456 00:36:11.793 Removing: /var/run/dpdk/spdk_pid446809 00:36:11.793 Removing: /var/run/dpdk/spdk_pid446860 00:36:11.793 
Removing: /var/run/dpdk/spdk_pid447089 00:36:11.793 Removing: /var/run/dpdk/spdk_pid447283 00:36:11.793 Removing: /var/run/dpdk/spdk_pid447480 00:36:11.793 Removing: /var/run/dpdk/spdk_pid447676 00:36:11.793 Removing: /var/run/dpdk/spdk_pid447879 00:36:11.793 Removing: /var/run/dpdk/spdk_pid448181 00:36:11.793 Removing: /var/run/dpdk/spdk_pid448430 00:36:11.793 Removing: /var/run/dpdk/spdk_pid448643 00:36:11.793 Removing: /var/run/dpdk/spdk_pid448840 00:36:11.793 Removing: /var/run/dpdk/spdk_pid449037 00:36:11.793 Removing: /var/run/dpdk/spdk_pid449244 00:36:11.793 Removing: /var/run/dpdk/spdk_pid449598 00:36:11.793 Removing: /var/run/dpdk/spdk_pid449800 00:36:11.793 Removing: /var/run/dpdk/spdk_pid450005 00:36:11.793 Removing: /var/run/dpdk/spdk_pid450202 00:36:11.793 Removing: /var/run/dpdk/spdk_pid450404 00:36:11.793 Removing: /var/run/dpdk/spdk_pid450710 00:36:11.793 Removing: /var/run/dpdk/spdk_pid450957 00:36:11.793 Removing: /var/run/dpdk/spdk_pid451164 00:36:11.793 Removing: /var/run/dpdk/spdk_pid451357 00:36:11.793 Removing: /var/run/dpdk/spdk_pid451556 00:36:11.793 Removing: /var/run/dpdk/spdk_pid451914 00:36:11.793 Removing: /var/run/dpdk/spdk_pid452121 00:36:11.793 Removing: /var/run/dpdk/spdk_pid452485 00:36:11.793 Removing: /var/run/dpdk/spdk_pid452848 00:36:11.793 Removing: /var/run/dpdk/spdk_pid453055 00:36:11.793 Removing: /var/run/dpdk/spdk_pid453428 00:36:11.793 Removing: /var/run/dpdk/spdk_pid453792 00:36:11.793 Removing: /var/run/dpdk/spdk_pid454004 00:36:11.793 Removing: /var/run/dpdk/spdk_pid454229 00:36:11.793 Removing: /var/run/dpdk/spdk_pid454558 00:36:11.793 Removing: /var/run/dpdk/spdk_pid454952 00:36:11.793 Removing: /var/run/dpdk/spdk_pid455332 00:36:11.793 Removing: /var/run/dpdk/spdk_pid455527 00:36:11.793 Removing: /var/run/dpdk/spdk_pid459549 00:36:11.793 Removing: /var/run/dpdk/spdk_pid461375 00:36:11.793 Removing: /var/run/dpdk/spdk_pid463128 00:36:11.793 Removing: /var/run/dpdk/spdk_pid464340 00:36:11.793 Removing: 
/var/run/dpdk/spdk_pid465524 00:36:11.793 Removing: /var/run/dpdk/spdk_pid465735 00:36:11.793 Removing: /var/run/dpdk/spdk_pid465926 00:36:11.793 Removing: /var/run/dpdk/spdk_pid465950 00:36:11.793 Removing: /var/run/dpdk/spdk_pid469736 00:36:11.793 Removing: /var/run/dpdk/spdk_pid470281 00:36:11.793 Removing: /var/run/dpdk/spdk_pid471182 00:36:11.793 Removing: /var/run/dpdk/spdk_pid471388 00:36:11.793 Removing: /var/run/dpdk/spdk_pid476707 00:36:11.793 Removing: /var/run/dpdk/spdk_pid478335 00:36:11.793 Removing: /var/run/dpdk/spdk_pid479198 00:36:11.793 Removing: /var/run/dpdk/spdk_pid483388 00:36:11.793 Removing: /var/run/dpdk/spdk_pid485137 00:36:11.793 Removing: /var/run/dpdk/spdk_pid485985 00:36:11.793 Removing: /var/run/dpdk/spdk_pid490499 00:36:11.793 Removing: /var/run/dpdk/spdk_pid492999 00:36:11.793 Removing: /var/run/dpdk/spdk_pid493974 00:36:11.793 Removing: /var/run/dpdk/spdk_pid503582 00:36:11.793 Removing: /var/run/dpdk/spdk_pid505700 00:36:11.793 Removing: /var/run/dpdk/spdk_pid506815 00:36:11.793 Removing: /var/run/dpdk/spdk_pid516466 00:36:11.793 Removing: /var/run/dpdk/spdk_pid519213 00:36:11.793 Removing: /var/run/dpdk/spdk_pid520196 00:36:11.793 Removing: /var/run/dpdk/spdk_pid529921 00:36:11.793 Removing: /var/run/dpdk/spdk_pid533354 00:36:11.793 Removing: /var/run/dpdk/spdk_pid534335 00:36:11.793 Removing: /var/run/dpdk/spdk_pid545414 00:36:11.793 Removing: /var/run/dpdk/spdk_pid548054 00:36:11.793 Removing: /var/run/dpdk/spdk_pid549198 00:36:11.793 Removing: /var/run/dpdk/spdk_pid559780 00:36:11.793 Removing: /var/run/dpdk/spdk_pid562209 00:36:12.051 Removing: /var/run/dpdk/spdk_pid563203 00:36:12.051 Removing: /var/run/dpdk/spdk_pid574527 00:36:12.051 Removing: /var/run/dpdk/spdk_pid578299 00:36:12.051 Removing: /var/run/dpdk/spdk_pid579283 00:36:12.051 Removing: /var/run/dpdk/spdk_pid580420 00:36:12.051 Removing: /var/run/dpdk/spdk_pid583479 00:36:12.051 Removing: /var/run/dpdk/spdk_pid588523 00:36:12.051 Removing: 
/var/run/dpdk/spdk_pid591047 00:36:12.051 Removing: /var/run/dpdk/spdk_pid595546 00:36:12.051 Removing: /var/run/dpdk/spdk_pid599566 00:36:12.051 Removing: /var/run/dpdk/spdk_pid604819 00:36:12.051 Removing: /var/run/dpdk/spdk_pid607697 00:36:12.051 Removing: /var/run/dpdk/spdk_pid614006 00:36:12.051 Removing: /var/run/dpdk/spdk_pid616262 00:36:12.051 Removing: /var/run/dpdk/spdk_pid622375 00:36:12.051 Removing: /var/run/dpdk/spdk_pid625040 00:36:12.051 Removing: /var/run/dpdk/spdk_pid631142 00:36:12.051 Removing: /var/run/dpdk/spdk_pid633402 00:36:12.051 Removing: /var/run/dpdk/spdk_pid637544 00:36:12.051 Removing: /var/run/dpdk/spdk_pid637894 00:36:12.051 Removing: /var/run/dpdk/spdk_pid638252 00:36:12.051 Removing: /var/run/dpdk/spdk_pid638613 00:36:12.051 Removing: /var/run/dpdk/spdk_pid639070 00:36:12.051 Removing: /var/run/dpdk/spdk_pid639821 00:36:12.051 Removing: /var/run/dpdk/spdk_pid640663 00:36:12.051 Removing: /var/run/dpdk/spdk_pid641100 00:36:12.051 Removing: /var/run/dpdk/spdk_pid642536 00:36:12.051 Removing: /var/run/dpdk/spdk_pid644135 00:36:12.051 Removing: /var/run/dpdk/spdk_pid645790 00:36:12.051 Removing: /var/run/dpdk/spdk_pid647046 00:36:12.051 Removing: /var/run/dpdk/spdk_pid648651 00:36:12.051 Removing: /var/run/dpdk/spdk_pid650845 00:36:12.051 Removing: /var/run/dpdk/spdk_pid652503 00:36:12.051 Removing: /var/run/dpdk/spdk_pid653789 00:36:12.051 Removing: /var/run/dpdk/spdk_pid654361 00:36:12.051 Removing: /var/run/dpdk/spdk_pid654734 00:36:12.051 Removing: /var/run/dpdk/spdk_pid656908 00:36:12.051 Removing: /var/run/dpdk/spdk_pid658761 00:36:12.051 Removing: /var/run/dpdk/spdk_pid660604 00:36:12.051 Removing: /var/run/dpdk/spdk_pid661677 00:36:12.051 Removing: /var/run/dpdk/spdk_pid662741 00:36:12.051 Removing: /var/run/dpdk/spdk_pid663289 00:36:12.051 Removing: /var/run/dpdk/spdk_pid663422 00:36:12.051 Removing: /var/run/dpdk/spdk_pid663541 00:36:12.051 Removing: /var/run/dpdk/spdk_pid663756 00:36:12.051 Removing: 
/var/run/dpdk/spdk_pid663929 00:36:12.051 Removing: /var/run/dpdk/spdk_pid665150 00:36:12.051 Removing: /var/run/dpdk/spdk_pid666670 00:36:12.051 Removing: /var/run/dpdk/spdk_pid668176 00:36:12.051 Removing: /var/run/dpdk/spdk_pid668895 00:36:12.051 Removing: /var/run/dpdk/spdk_pid669643 00:36:12.051 Removing: /var/run/dpdk/spdk_pid669976 00:36:12.051 Removing: /var/run/dpdk/spdk_pid670004 00:36:12.051 Removing: /var/run/dpdk/spdk_pid670046 00:36:12.051 Removing: /var/run/dpdk/spdk_pid670977 00:36:12.051 Removing: /var/run/dpdk/spdk_pid671522 00:36:12.051 Removing: /var/run/dpdk/spdk_pid672054 00:36:12.051 Removing: /var/run/dpdk/spdk_pid674028 00:36:12.309 Removing: /var/run/dpdk/spdk_pid676365 00:36:12.309 Removing: /var/run/dpdk/spdk_pid678107 00:36:12.309 Removing: /var/run/dpdk/spdk_pid679171 00:36:12.309 Removing: /var/run/dpdk/spdk_pid680404 00:36:12.309 Removing: /var/run/dpdk/spdk_pid680947 00:36:12.309 Removing: /var/run/dpdk/spdk_pid681014 00:36:12.309 Removing: /var/run/dpdk/spdk_pid685014 00:36:12.309 Removing: /var/run/dpdk/spdk_pid685104 00:36:12.309 Removing: /var/run/dpdk/spdk_pid685293 00:36:12.309 Removing: /var/run/dpdk/spdk_pid685483 00:36:12.309 Removing: /var/run/dpdk/spdk_pid685633 00:36:12.309 Removing: /var/run/dpdk/spdk_pid685812 00:36:12.309 Removing: /var/run/dpdk/spdk_pid686626 00:36:12.309 Removing: /var/run/dpdk/spdk_pid687958 00:36:12.309 Removing: /var/run/dpdk/spdk_pid689018 00:36:12.309 Clean 00:36:12.309 10:43:49 -- common/autotest_common.sh@1451 -- # return 0 00:36:12.309 10:43:49 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:36:12.309 10:43:49 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:12.309 10:43:49 -- common/autotest_common.sh@10 -- # set +x 00:36:12.309 10:43:49 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:36:12.309 10:43:49 -- common/autotest_common.sh@728 -- # xtrace_disable 00:36:12.309 10:43:49 -- common/autotest_common.sh@10 -- # set +x 00:36:12.567 10:43:49 -- spdk/autotest.sh@387 
-- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:36:12.567 10:43:49 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:36:12.567 10:43:49 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:36:12.567 10:43:49 -- spdk/autotest.sh@391 -- # hash lcov 00:36:12.567 10:43:49 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:36:12.567 10:43:49 -- spdk/autotest.sh@393 -- # hostname 00:36:12.567 10:43:49 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-50 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:36:12.567 geninfo: WARNING: invalid characters removed from testname! 00:36:44.635 10:44:16 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:44.635 10:44:20 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:36:46.012 10:44:23 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc 
genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:48.545 10:44:25 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:51.109 10:44:28 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:54.399 10:44:30 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:36:56.304 10:44:33 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:36:56.562 10:44:33 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:36:56.562 10:44:33 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:36:56.562 10:44:33 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:36:56.562 10:44:33 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:36:56.562 10:44:33 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:56.562 10:44:33 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:56.562 10:44:33 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:56.562 10:44:33 -- paths/export.sh@5 -- $ export PATH
00:36:56.562 10:44:33 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:36:56.562 10:44:33 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:36:56.562 10:44:33 -- common/autobuild_common.sh@444 -- $ date +%s
00:36:56.562 10:44:33 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721033073.XXXXXX
00:36:56.562 10:44:33 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721033073.6aAeED
00:36:56.562 10:44:33 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:36:56.562 10:44:33 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:36:56.562 10:44:33 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:36:56.562 10:44:33 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:36:56.562 10:44:33 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:36:56.562 10:44:33 -- common/autobuild_common.sh@460 -- $ get_config_params
00:36:56.562 10:44:33 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:36:56.562 10:44:33 -- common/autotest_common.sh@10 -- $ set +x
00:36:56.562 10:44:33 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:36:56.562 10:44:33 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:36:56.562 10:44:33 -- pm/common@17 -- $ local monitor
00:36:56.562 10:44:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:56.562 10:44:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:56.562 10:44:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:56.562 10:44:33 -- pm/common@21 -- $ date +%s
00:36:56.562 10:44:33 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:56.562 10:44:33 -- pm/common@21 -- $ date +%s
00:36:56.562 10:44:33 -- pm/common@25 -- $ sleep 1
00:36:56.562 10:44:33 -- pm/common@21 -- $ date +%s
00:36:56.562 10:44:33 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721033073
00:36:56.562 10:44:33 -- pm/common@21 -- $ date +%s
00:36:56.562 10:44:33 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721033073
00:36:56.562 10:44:33 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721033073
00:36:56.562 10:44:33 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721033073
00:36:56.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721033073_collect-vmstat.pm.log
00:36:56.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721033073_collect-cpu-load.pm.log
00:36:56.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721033073_collect-cpu-temp.pm.log
00:36:56.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721033073_collect-bmc-pm.bmc.pm.log
00:36:57.499 10:44:34 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:36:57.499 10:44:34 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:36:57.499 10:44:34 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:57.499 10:44:34 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:36:57.499 10:44:34 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:36:57.499 10:44:34 -- spdk/autopackage.sh@19 -- $ timing_finish
00:36:57.499 10:44:34 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:36:57.499 10:44:34 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:36:57.499 10:44:34 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:36:57.499 10:44:34 -- spdk/autopackage.sh@20 -- $ exit 0
00:36:57.499 10:44:34 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:36:57.499 10:44:34 -- pm/common@29 -- $ signal_monitor_resources TERM
00:36:57.499 10:44:34 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:36:57.499 10:44:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:57.499 10:44:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:36:57.499 10:44:34 -- pm/common@44 -- $ pid=699614
00:36:57.499 10:44:34 -- pm/common@50 -- $ kill -TERM 699614
00:36:57.499 10:44:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:57.499 10:44:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:36:57.499 10:44:34 -- pm/common@44 -- $ pid=699616
00:36:57.499 10:44:34 -- pm/common@50 -- $ kill -TERM 699616
00:36:57.499 10:44:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:57.499 10:44:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:36:57.499 10:44:34 -- pm/common@44 -- $ pid=699618
00:36:57.499 10:44:34 -- pm/common@50 -- $ kill -TERM 699618
00:36:57.499 10:44:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:36:57.499 10:44:34 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:36:57.499 10:44:34 -- pm/common@44 -- $ pid=699641
00:36:57.499 10:44:34 -- pm/common@50 -- $ sudo -E kill -TERM 699641
00:36:57.758 + [[ -n 317971 ]]
00:36:57.758 + sudo kill 317971
00:36:57.768 [Pipeline] }
00:36:57.786 [Pipeline] // stage
00:36:57.791 [Pipeline] }
00:36:57.811 [Pipeline] // timeout
00:36:57.819 [Pipeline] }
00:36:57.836 [Pipeline] // catchError
00:36:57.842 [Pipeline] }
00:36:57.859 [Pipeline] // wrap
00:36:57.865 [Pipeline] }
00:36:57.881 [Pipeline] // catchError
00:36:57.890 [Pipeline] stage
00:36:57.893 [Pipeline] { (Epilogue)
00:36:57.908 [Pipeline] catchError
00:36:57.910 [Pipeline] {
00:36:57.924 [Pipeline] echo
00:36:57.926 Cleanup processes
00:36:57.933 [Pipeline] sh
00:36:58.217 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:58.217 699722 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:36:58.217 699935 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:58.231 [Pipeline] sh
00:36:58.514 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:36:58.514 ++ grep -v 'sudo pgrep'
00:36:58.514 ++ awk '{print $1}'
00:36:58.514 + sudo kill -9 699722
00:36:58.526 [Pipeline] sh
00:36:58.804 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:37:11.013 [Pipeline] sh
00:37:11.334 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:37:11.334 Artifacts sizes are good
00:37:11.349 [Pipeline] archiveArtifacts
00:37:11.357 Archiving artifacts
00:37:11.660 [Pipeline] sh
00:37:11.943 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:37:11.957 [Pipeline] cleanWs
00:37:11.967 [WS-CLEANUP] Deleting project workspace...
00:37:11.967 [WS-CLEANUP] Deferred wipeout is used...
00:37:11.974 [WS-CLEANUP] done
00:37:11.975 [Pipeline] }
00:37:11.997 [Pipeline] // catchError
00:37:12.010 [Pipeline] sh
00:37:12.287 + logger -p user.info -t JENKINS-CI
00:37:12.297 [Pipeline] }
00:37:12.314 [Pipeline] // stage
00:37:12.320 [Pipeline] }
00:37:12.337 [Pipeline] // node
00:37:12.344 [Pipeline] End of Pipeline
00:37:12.373 Finished: SUCCESS